Aggregate Medians [wbburgin]
This indicator recursively finds the average of all high/low medians up to your chosen length. This can be very helpful for analyzing trends where a moving average or a single median would produce many false signals.
Settings:
The "Length" setting is the maximum median length that the algorithm adds into the sum. The "Start at Period" setting is the minimum median length that the algorithm takes into account. Starting at a higher period excludes the faster, more sensitive medians of lower lengths and smooths out your curve.
I haven't seen many recursive algorithms on TradingView so feel free to use this script as inspiration for any of your ideas. In theory, you can essentially replace the median function with any other function - a moving average, a supertrend, or anything else.
The start must be lower than the length, because the algorithm sums every median from the start period up to the length.
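A minimal Python sketch of the aggregation described above, assuming the high/low median is taken as a rolling median of (high + low) / 2 (the script's exact source and median definition may differ):

```python
import numpy as np

def aggregate_median(high, low, length, start):
    """Average the rolling medians of (high + low) / 2 for every window
    size from `start` up to `length` (start must be lower than length)."""
    hl2 = (np.asarray(high, dtype=float) + np.asarray(low, dtype=float)) / 2.0
    medians = []
    for window in range(start, length + 1):
        if len(hl2) >= window:
            medians.append(np.median(hl2[-window:]))  # median of the last `window` bars
    return float(np.mean(medians))

# Example: value for the latest bar using all median lengths from 10 to 50
# value = aggregate_median(high_series, low_series, length=50, start=10)
```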
MLExtensions
Library "MLExtensions"
normalizeDeriv(src, quadraticMeanLength)
Returns the normalized derivative of the input series.
Parameters:
src : The input series (i.e., the first-order derivative for price).
quadraticMeanLength : The length of the quadratic mean (RMS).
Returns: nDeriv The normalized derivative of the input series.
normalize(src, min, max)
Rescales a source value with an unbounded range to a target range.
Parameters:
src : The input series
min : The minimum value of the unbounded range
max : The maximum value of the unbounded range
Returns: The normalized series
rescale(src, oldMin, oldMax, newMin, newMax)
Rescales a source value with a bounded range to another bounded range.
Parameters:
src : The input series
oldMin : The minimum value of the range to rescale from
oldMax : The maximum value of the range to rescale from
newMin : The minimum value of the range to rescale to
newMax : The maximum value of the range to rescale to
Returns: The rescaled series
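A hedged Python sketch of the two rescaling helpers as described above (an interpretation of the documented behavior, not the library source):

```python
def rescale(src, old_min, old_max, new_min, new_max):
    """Linearly map a value from [old_min, old_max] to [new_min, new_max]."""
    return new_min + (new_max - new_min) * (src - old_min) / max(old_max - old_min, 1e-10)

def normalize(values, new_min, new_max):
    """Rescale an unbounded series to a target range using its observed min/max."""
    lo, hi = min(values), max(values)
    return [rescale(v, lo, hi, new_min, new_max) for v in values]

# normalize([1.0, 5.0, 3.0], 0.0, 1.0)  ->  [0.0, 1.0, 0.5]
```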
color_green(prediction)
Assigns varying shades of the color green based on the KNN classification
Parameters:
prediction : Value (int|float) of the prediction
Returns: color
color_red(prediction)
Assigns varying shades of the color red based on the KNN classification
Parameters:
prediction : Value of the prediction
Returns: color
tanh(src)
Returns the hyperbolic tangent of the input series. The sigmoid-like hyperbolic tangent function is used to compress the input to a value between -1 and 1.
Parameters:
src : The input series (i.e., the normalized derivative).
Returns: tanh The hyperbolic tangent of the input series.
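For illustration, a Python sketch combining the normalized-derivative and tanh steps described above; the 2-bar derivative and the RMS (quadratic mean) window are assumptions based on the documentation, not the library source:

```python
import math

def normalize_deriv(series, quadratic_mean_length):
    """First-order derivative of the series divided by its RMS over a lookback."""
    deriv = [series[i] - series[i - 2] for i in range(2, len(series))]  # 2-bar derivative (assumed)
    window = deriv[-quadratic_mean_length:]
    rms = math.sqrt(sum(d * d for d in window) / len(window))
    return deriv[-1] / rms if rms else 0.0

def tanh(x):
    """Compress the normalized derivative into the range (-1, 1)."""
    return math.tanh(x)

# compressed = tanh(normalize_deriv(closes, quadratic_mean_length=14))
```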
dualPoleFilter(src, lookback)
Returns the smoothed hyperbolic tangent of the input series.
Parameters:
src : The input series (i.e., the hyperbolic tangent).
lookback : The lookback window for the smoothing.
Returns: filter The smoothed hyperbolic tangent of the input series.
tanhTransform(src, smoothingFrequency, quadraticMeanLength)
Returns the tanh transform of the input series.
Parameters:
src : The input series (i.e., the result of the tanh calculation).
smoothingFrequency
quadraticMeanLength
Returns: signal The smoothed hyperbolic tangent transform of the input series.
n_rsi(src, n1, n2)
Returns the normalized RSI ideal for use in ML algorithms.
Parameters:
src : The input series (i.e., the result of the RSI calculation).
n1 : The length of the RSI.
n2 : The smoothing length of the RSI.
Returns: signal The normalized RSI.
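As an illustration of the normalization idea, a hedged Python sketch that computes a simple-average RSI, EMA-smooths it, and rescales the 0-100 output to 0-1; the library's exact RSI and smoothing variants may differ:

```python
def rsi(closes, length):
    """Simple-average RSI over the last `length` price changes."""
    deltas = [closes[i] - closes[i - 1] for i in range(1, len(closes))][-length:]
    gains = sum(d for d in deltas if d > 0)
    losses = sum(-d for d in deltas if d < 0)
    return 100.0 if losses == 0 else 100.0 - 100.0 / (1.0 + gains / losses)

def n_rsi(closes, n1, n2):
    """Normalized RSI: EMA-smooth the RSI series, then rescale 0..100 to 0..1."""
    alpha = 2.0 / (n2 + 1)
    smoothed = None
    for i in range(n1 + 1, len(closes) + 1):  # assumes len(closes) > n1
        value = rsi(closes[:i], n1)
        smoothed = value if smoothed is None else alpha * value + (1 - alpha) * smoothed
    return smoothed / 100.0
```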
n_cci(src, n1, n2)
Returns the normalized CCI ideal for use in ML algorithms.
Parameters:
src : The input series (i.e., the result of the CCI calculation).
n1 : The length of the CCI.
n2 : The smoothing length of the CCI.
Returns: signal The normalized CCI.
n_wt(src, n1, n2)
Returns the normalized WaveTrend Classic series ideal for use in ML algorithms.
Parameters:
src : The input series (i.e., the result of the WaveTrend Classic calculation).
n1
n2
Returns: signal The normalized WaveTrend Classic series.
n_adx(highSrc, lowSrc, closeSrc, n1)
Returns the normalized ADX ideal for use in ML algorithms.
Parameters:
highSrc : The input series for the high price.
lowSrc : The input series for the low price.
closeSrc : The input series for the close price.
n1 : The length of the ADX.
regime_filter(src, threshold, useRegimeFilter)
Parameters:
src
threshold
useRegimeFilter
filter_adx(src, length, adxThreshold, useAdxFilter)
filter_adx
Parameters:
src : The source series.
length : The length of the ADX.
adxThreshold : The ADX threshold.
useAdxFilter : Whether to use the ADX filter.
Returns: The ADX.
filter_volatility(minLength, maxLength, useVolatilityFilter)
filter_volatility
Parameters:
minLength : The minimum length of the ATR.
maxLength : The maximum length of the ATR.
useVolatilityFilter : Whether to use the volatility filter.
Returns: Boolean indicating whether or not to let the signal pass through the filter.
backtest(high, low, open, startLongTrade, endLongTrade, startShortTrade, endShortTrade, isStopLossHit, maxBarsBackIndex, thisBarIndex)
Performs a basic backtest using the specified parameters and conditions.
Parameters:
high : The input series for the high price.
low : The input series for the low price.
open : The input series for the open price.
startLongTrade : The series of conditions that indicate the start of a long trade.
endLongTrade : The series of conditions that indicate the end of a long trade.
startShortTrade : The series of conditions that indicate the start of a short trade.
endShortTrade : The series of conditions that indicate the end of a short trade.
isStopLossHit : The stop loss hit indicator.
maxBarsBackIndex : The maximum number of bars to go back in the backtest.
thisBarIndex : The current bar index.
Returns: A tuple containing backtest values
init_table()
init_table()
Returns: tbl The backtest results.
update_table(tbl, tradeStatsHeader, totalTrades, totalWins, totalLosses, winLossRatio, winrate, stopLosses)
update_table(tbl, tradeStats)
Parameters:
tbl : The backtest results table.
tradeStatsHeader : The trade stats header.
totalTrades : The total number of trades.
totalWins : The total number of wins.
totalLosses : The total number of losses.
winLossRatio : The win loss ratio.
winrate : The winrate.
stopLosses : The total number of stop losses.
Returns: Updated backtest results table.
WaveTrend 3D
█ OVERVIEW
WaveTrend 3D (WT3D) is a novel implementation of the famous WaveTrend (WT) indicator and has been completely redesigned from the ground up to address some of the inherent shortcomings associated with the traditional WT algorithm.
█ BACKGROUND
The WaveTrend (WT) indicator has become a widely popular tool for traders in recent years. WT was first ported to PineScript in 2014 by the user @LazyBear, and since then, it has ascended to become one of the Top 5 most popular scripts on TradingView.
The WT algorithm appears to have origins in a lesser-known proprietary algorithm called Trading Channel Index (TCI), created by AIQ Systems in 1986 as an integral part of their commercial software suite, TradingExpert Pro. The software’s reference manual states that “TCI identifies changes in price direction” and is “an adaptation of Donald R. Lambert’s Commodity Channel Index (CCI)”, which was introduced to the world six years earlier in 1980. Interestingly, a vestige of this early beginning can still be seen in the source code of LazyBear’s script, where the final EMA calculation is stored in an intermediate variable called “tci” in the code.
█ IMPLEMENTATION DETAILS
WaveTrend 3D is an alternative implementation of WaveTrend that directly addresses some of the known shortcomings of the indicator, including its unbounded extremes, susceptibility to whipsaw, and lack of insight into other timeframes.
In the canonical WT approach, an exponential moving average (EMA) for a given lookback window is used to assess the variability between price and two other EMAs relative to a second lookback window. Since the difference between the average price and its associated EMA is essentially unbounded, an arbitrary scaling factor of 0.015 is typically applied as a crude form of rescaling but still fails to capture 20-30% of values between the range of -100 to 100. Additionally, the trigger signal for the final EMA (i.e., TCI) crossover-based oscillator is a four-bar simple moving average (SMA), which further contributes to the net lag accumulated by the consecutive EMA calculations in the previous steps.
The core idea behind WT3D is to replace the EMA-based crossover system with modern Digital Signal Processing techniques. By assuming that price action adheres approximately to a Gaussian distribution, it is possible to sidestep the scaling nightmare associated with unbounded price differentials of the original WaveTrend method by focusing instead on the alteration of the underlying Probability Distribution Function (PDF) of the input series. Furthermore, using a signal processing filter such as a Butterworth Filter, we can eliminate the need for consecutive exponential moving averages along with the associated lag they bring.
Ideally, it is convenient to have the resulting probability distribution oscillate between the values of -1 and 1, with the zero line serving as a median. With this objective in mind, it is possible to borrow a common technique from the field of Machine Learning that uses a sigmoid-like activation function to transform our data set of interest. One such function is the hyperbolic tangent function (tanh), which is often used as an activation function in the hidden layers of neural networks due to its unique property of ensuring the values stay between -1 and 1. By taking the first-order derivative of our input series and normalizing it using the quadratic mean, the tanh function performs a high-quality redistribution of the input signal into the desired range of -1 to 1. Finally, using a dual-pole filter such as the Butterworth Filter popularized by John Ehlers, excessive market noise can be filtered out, leaving behind a crisp moving average with minimal lag.
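A hedged Python sketch of the processing chain described above: first-order derivative, quadratic-mean (RMS) normalization, tanh compression, and two-pole smoothing. An Ehlers-style two-pole smoother stands in here for the Butterworth filter mentioned in the text, and the default lengths are placeholders:

```python
import math

def wt3d_core(closes, rms_length=50, smooth_period=20):
    """Derivative -> RMS normalization -> tanh -> two-pole smoothing."""
    deriv = [closes[i] - closes[i - 2] for i in range(2, len(closes))]
    # Ehlers-style two-pole smoother coefficients (stand-in for the Butterworth filter)
    a1 = math.exp(-math.sqrt(2) * math.pi / smooth_period)
    b1 = 2 * a1 * math.cos(math.sqrt(2) * math.pi / smooth_period)
    c2, c3 = b1, -a1 * a1
    c1 = 1 - c2 - c3
    out, filt1, filt2 = [], 0.0, 0.0
    for i, d in enumerate(deriv):
        window = deriv[max(0, i - rms_length + 1):i + 1]
        rms = math.sqrt(sum(v * v for v in window) / len(window))
        squashed = math.tanh(d / rms) if rms else 0.0        # compress into (-1, 1)
        smoothed = c1 * squashed + c2 * filt1 + c3 * filt2   # two-pole recursion
        filt1, filt2 = smoothed, filt1
        out.append(smoothed)
    return out  # oscillates roughly between -1 and 1
```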
Furthermore, WT3D expands upon the original functionality of WT by providing:
First-class support for multi-timeframe (MTF) analysis
Kernel-based regression for trend reversal confirmation
Various options for signal smoothing and transformation
A unique mode for visualizing an input series as a symmetrical, three-dimensional waveform useful for pattern identification and cycle-related analysis
█ SETTINGS
This is a summary of the settings used in the script, listed in roughly the order in which they appear. All default colors are from Google's TensorFlow framework and are considered to be colorblind safe.
Source: The input series. Usually, it is the close or average price, but it can be any series.
Use Mirror: Whether to display a mirror image of the source series; for visualizing the series as a 3D waveform similar to a soundwave.
Use EMA: Whether to use an exponential moving average of the input series.
EMA Length: The length of the exponential moving average.
Use COG: Whether to use the center of gravity of the input series.
COG Length: The length of the center of gravity.
Speed to Emphasize: The target speed to emphasize.
Width: The width of the emphasized line.
Display Kernel Moving Average: Whether to display the kernel moving average of the signal. This is similar to PCA, an unsupervised Machine Learning technique in which neighboring vectors are projected onto the principal component.
Display Kernel Signal: Whether to display the kernel estimator for the emphasized line. Like the Kernel MA, it can show underlying shifts in bias within a more significant trend by the colors reflected on the ribbon itself.
Show Oscillator Lines: Whether to show the oscillator lines.
Offset: The offset of the emphasized oscillator plots.
Fast Length: The length scale factor for the fast oscillator.
Fast Smoothing: The smoothing scale factor for the fast oscillator.
Normal Length: The length scale factor for the normal oscillator.
Normal Smoothing: The smoothing scale factor for the normal frequency.
Slow Length: The length scale factor for the slow oscillator.
Slow Smoothing: The smoothing scale factor for the slow frequency.
Divergence Threshold: The number of bars for the divergence to be considered significant.
Trigger Wave Percent Size: How big the current wave should be relative to the previous wave.
Background Area Transparency Factor: Transparency factor for the background area.
Foreground Area Transparency Factor: Transparency factor for the foreground area.
Background Line Transparency Factor: Transparency factor for the background line.
Foreground Line Transparency Factor: Transparency factor for the foreground line.
Custom Transparency: Transparency of the custom colors.
Total Gradient Steps: The maximum number of steps supported for a gradient calculation is 256.
Fast Bullish Color: The color of the fast bullish line.
Normal Bullish Color: The color of the normal bullish line.
Slow Bullish Color: The color of the slow bullish line.
Fast Bearish Color: The color of the fast bearish line.
Normal Bearish Color: The color of the normal bearish line.
Slow Bearish Color: The color of the slow bearish line.
Bullish Divergence Signals: The color of the bullish divergence signals.
Bearish Divergence Signals: The color of the bearish divergence signals.
█ ACKNOWLEDGEMENTS
@LazyBear - For authoring the original WaveTrend port on TradingView
@PineCoders - For the beautiful color gradient framework used in this indicator
@veryfid - For the inspiration of using mirrored signals for cycle analysis and using multiple lookback windows as proxies for other timeframes
Improved Chaikin Money Flow
Chaikin Money Flow is a well-known indicator for gauging buying/selling pressure. Marc Chaikin intended this to be used on the daily timeframe to capture the behavior of price action at or near the daily close, when larger-scale actors influence the market. The calculation is straightforward, as described within the built-in TradingView "CMF" indicator:
1. Period Money Flow Multiplier = ((Close - Low) - (High - Close)) /(High - Low)
2. Period Money Flow Volume = Period Money Flow Multiplier x Volume for the Period
3. Chaikin Money Flow = 21 Period Sum of Money Flow Volume / 21 Period Sum of Volume
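A direct Python transcription of the three steps above, using the default 21-period window:

```python
def chaikin_money_flow(highs, lows, closes, volumes, period=21):
    """Steps 1-3: money flow multiplier, money flow volume, and the period ratio."""
    mf_volumes = []
    for h, l, c, v in zip(highs, lows, closes, volumes):
        multiplier = ((c - l) - (h - c)) / (h - l) if h != l else 0.0  # step 1
        mf_volumes.append(multiplier * v)                               # step 2
    return sum(mf_volumes[-period:]) / sum(volumes[-period:])           # step 3
```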
There is, however, a problem with this algorithm: it does not account for daily gaps in price action. This leads to the indicator sometimes moving out-of-sync with price action and/or an under-emphasis of the magnitude change of the indicator relative to the change in price action. This is a significant problem for someone trying to read divergences against an underlying.
Note: I have never seen a published attempt to improve this indicator which is why I decided that there had to be a way to do it.
In order to mitigate this issue, I have taken the basic script provided by TradingView and made a key modification. If the open of a candle is outside the range of the previous candle, then the close of the previous candle is used as the "high" for the current candle (in the case of a gap down) or the "low" for the current candle (in the case of a gap up). However, if the close of the current candle exceeds the previous close, highs and lows for the current candle are calculated as normal. I believe this accounts for gaps in price action without significantly altering the original intent of the indicator.
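A hedged sketch of that gap adjustment as I read it (the published script may differ in detail):

```python
def adjusted_high_low(o, h, l, c, prev_high, prev_low, prev_close):
    """Substitute the previous close for the high/low on gap bars, per the description above."""
    if c > prev_close:           # current close exceeds the previous close: use the bar as-is
        return h, l
    if o < prev_low:             # gap down: previous close stands in for the "high"
        return prev_close, l
    if o > prev_high:            # gap up: previous close stands in for the "low"
        return h, prev_close
    return h, l
```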
I have made four other minor tweaks:
1. Default style is color coded area above and below the Zero Line
2. Range scaled to +/-100 instead of +/-1 (displays better on graph)
3. Set timeframe to Daily (as that is the timeframe for which this indicator was intended by Chaikin)
4. Length defaults to 21 (which is what Chaikin uses)
Extreme Trend Reversal Points [HeWhoMustNotBeNamed]
Using a moving average crossover to identify a change in trend is very common. However, this method can give lots of false signals during ranging markets. In this algorithm, we try to find the extreme trend by looking at fully aligned multi-level moving averages and only consider a moving average crossover when the market is in an extreme trend - either bullish or bearish. These points can mean a long-term downtrend or can also cause a small pullback before trend continuation. In this discussion, we will also check how to handle different scenarios.
🎲 Components
🎯 Recursive Multi Level Moving Averages
Multi-level moving average here refers to applying a moving average on top of a base moving average, across multiple levels. For example:
Level 1 SMA = SMA(source, length)
Level 2 SMA = SMA(Level 1 SMA, length)
Level 3 SMA = SMA(Level 2 SMA, length)
..
..
..
Level n SMA = SMA(Level (n-1) SMA, length)
In this script, the user can select how many levels of moving averages are calculated. This is achieved through a "recursive moving average" algorithm. The requirement for building such an algorithm was initially raised by @loxx
While I was able to develop this in minimal code with the help of some existing libraries built on arrays and matrices, I also thought: why not extend this to find something interesting?
Note that since we are using a variable number of levels, we cannot plot every moving average level (plotting cannot be done inside a loop). Hence, lines are used to display the latest moving average levels in front of the last candle. Lines are color-coded so that lower-numbered levels are greener and higher-numbered levels are redder.
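A minimal Python sketch of the recursion described above, assuming plain SMAs (the script lets the user choose the moving-average type and level count):

```python
def sma(values, length):
    """Simple moving average of the last `length` values."""
    return sum(values[-length:]) / length

def recursive_ma_levels(source, length, levels):
    """Level 1 = SMA(source); level k = SMA of the level k-1 series."""
    series = list(source)
    latest = []
    for _ in range(levels):
        series = [sma(series[:i + 1], min(length, i + 1)) for i in range(len(series))]
        latest.append(series[-1])
    return latest  # most recent value of each level, level 1 first
```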
🎯 Finding the trend and range
The strength of fully aligned moving averages is calculated based on the position of each level with respect to the other levels.
For example, in a complete uptrend, we can find
source > L(1)MA > L(2)MA > L(3)MA ...... > L(n-1)MA > L(n)MA
Similarly in a complete downtrend, we can find
source < L(1)MA < L(2)MA < L(3)MA ...... < L(n-1)MA < L(n)MA
Hence, the strength of the trend here is calculated based on the relative positions of the levels. As a result, the strength value can range from 0 to Level*(Level-1)/2:
0 represents the complete downtrend
Level*(Level-1)/2 represents the complete uptrend.
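The alignment strength can be computed by counting ordered pairs, as in this Python sketch; treating the source as the first element is my interpretation of the description:

```python
def alignment_strength(values):
    """Count ordered pairs where an earlier (faster) value sits above a later (slower) one.
    For n values the result ranges from 0 (complete downtrend) to n*(n-1)/2 (complete uptrend)."""
    n = len(values)
    return sum(1 for i in range(n) for j in range(i + 1, n) if values[i] > values[j])

# alignment_strength([source] + ma_levels)  ->  0 .. n*(n-1)/2
```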
Range and Extreme Range are calculated based on the percentile from median. The brackets are defined as per input parameters - Range Percentile and Extreme Range Percentile by using Percentile History as reference length.
Moving average plot is color coded to display the trend strength.
Green - Extreme Bullish
Lime - Bullish
Silver - range
Orange - Bearish
Red - Extreme Bearish
🎯 Finding the trend reversal
Possible trend reversals occur when price crosses the moving average while in a complete trend with all the moving averages fully aligned. Triangle marks are placed at such locations, which can help observe probable trend reversal points. However, the trend can override these levels. We can see an example of this here:
In order to overcome this problem, we can employ a few techniques:
1. After the signal, wait for trend reversal (moving average plot color to turn silver) before placing your order.
2. Place stop orders on immediate pivot levels or support/resistance points instead of opening a market order. This way, we can also place an order in the direction of the trend. Whichever side the price breaks out of will be the direction to trade.
3. Look for other confirmations, such as extremely bullish or bearish candles, before placing the orders.
🎯 An example of using stop orders
Let us take this scenario, where there is a signal for a possible reversal from a complete uptrend.
Create a box joining high and low pivots at a reasonable distance. You can also choose to add 1 ATR of additional distance from the pivots.
Use the top of the box as the stop-entry for long and the bottom as the stop-entry for short. The other ends of the box can become stop-losses for each side.
After a few bars, we can see that a few more signals are plotted, but the price is still within the box. Some candles touched the top of the box, but the candlestick patterns did not represent bullishness on those instances. If you have placed stop orders, these orders would already have been filled. In that case, just wait for the position to hit either stop or target.
For bullish side, targets can be placed at certain risk reward levels. In this case, we just use 1:1 for bullish (trend side) and 1:1.5 for bearish side (reversal side)
In this case, price hit the target without any issue:
Wait for next reversal signal to appear before placing another order :)
American Approximation Bjerksund & Stensland 2002 [Loxx]
American Approximation Bjerksund & Stensland 2002 is an American options pricing model. This indicator also includes numerical Greeks. You can compare the output of the American Approximation to the Black-Scholes-Merton value in the options panel output.
The Bjerksund & Stensland (2002) Approximation
The Bjerksund and Stensland (2002) approximation divides the time to maturity into two parts, each with a separate flat exercise boundary. It is thus a straightforward generalization of the Bjerksund-Stensland 1993 algorithm. The method is fast and efficient and should be more accurate than the Barone-Adesi and Whaley (1987) and the Bjerksund and Stensland (1993b) approximations. The algorithm requires an accurate cumulative bivariate normal approximation. Several approximations described in the literature are not sufficiently accurate, but the Genz algorithm works.
C = alpha2*S^B - alpha2*phi(S, t1, B, I2, I2)
+ phi(S, t1, 1, I2, I2) - phi(S, t1, 1, I1, I2)
- X*phi(S, t1, 0, I2, I2) + X*phi(S, t1, 0, I1, I2)
+ alpha1*phi(S, t1, B, I1, I2) - alpha1*psi(S, T, B, I1, I2, I1, t1)
+ psi(S, T, 1, I1, I2, I1, t1) - psi(S, T, 1, X, I2, I1, t1)
- X*psi(S, T, 0, I1, I2, I1, t1) + X*psi(S, T, 0, X, I2, I1, t1)
where
alpha1 = (I1 - X)*I1^-B
alpha2 = (I2 - X)*I2^-B
B = (1/2 - b/v^2) + ((b/v^2 - 1/2)^2 + 2*(r/v^2))^0.5
The function phi(S, T, gamma, H, I) is given by
phi(S, T, gamma, H, I) = e^lambda * S^gamma * (N(-d) - (I/S)^k * N(-d2))
d = (log(S/H) + (b + (gamma - 1/2) * v^2) * T) / (v * T^0.5)
d2 = (log(I^2/(S*H)) + (b + (gamma - 1/2) * v^2) * T) / (v * T^0.5)
lambda = -r + gamma * b + 1/2 * gamma * (gamma - 1) * v^2
k = 2*b/v^2 + (2 * gamma - 1)
and the trigger price I is defined as
I1 = B0 + (B(+infi) - B0) * (1 - e^h1)
I2 = B0 + (B(+infi) - B0) * (1 - e^h2)
h1 = -(b*t1 + 2*v*t1^0.5) * (X^2 / ((B(+infi) - B0) * B0))
h2 = -(b*T + 2*v*T^0.5) * (X^2 / ((B(+infi) - B0) * B0))
t1 = 1/2 * (5^0.5 - 1) * T
B(+infi) = (B / (B - 1)) * X
B0 = max(X, (r / (r - b)) * X)
Moreover, the function psi(S, T, gamma, H, I2, I1, t1) is given by
psi(S, T, gamma, H, I2, I1, t1, r, b, v) = e^(lambda * T) * S^gamma * (M(-e1, -f1, rho) - (I2/S)^k * M(-e2, -f2, rho)
- (I1/S)^k * M(-e3, -f3, -rho) + (I1/I2)^k * M(-e4, -f4, -rho))
where (see screenshot for e and f values)
b=r options on non-dividend paying stock
b=r-q options on stock or index paying a dividend yield of q
b=0 options on futures
b=r-rf currency options (where rf is the rate in the second currency)
Inputs
S = Stock price.
K = Strike price of option.
T = Time to expiration in years.
r = Risk-free rate
c = Cost of Carry
V = Variance of the underlying asset price
cnd1(x) = Cumulative Normal Distribution
cbnd3(x) = Cumulative Bivariate Normal Distribution
nd(x) = Standard Normal Density Function
convertingToCCRate(r, cmp ) = Rate compounder
Numerical Greeks or Greeks by Finite Difference
Analytical Greeks are the standard approach to estimating Delta, Gamma, etc. That is what we typically use when we can derive them from closed form solutions. Normally, these are well-defined and available in textbooks. Previously, we relied on closed form solutions for the call or put formulae differentiated with respect to the Black Scholes parameters. When Greeks formulae are difficult to develop or tease out, we can alternatively employ numerical Greeks - sometimes referred to as finite difference approximations. A key advantage of numerical Greeks relates to their estimation independent of deriving mathematical Greeks. This could be important when we examine American options, where there may not technically exist an exact closed form solution that is straightforward to work with. (via VinegarHill FinanceLabs)
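A generic Python sketch of finite-difference Greeks as described above: bump the input, re-price, and take differences. The price_fn below is a placeholder for any pricing function, such as the Bjerksund-Stensland approximation:

```python
def numerical_greeks(price_fn, S, h=0.01):
    """Central finite-difference delta and gamma for any pricing function price_fn(S)."""
    up, mid, down = price_fn(S + h), price_fn(S), price_fn(S - h)
    delta = (up - down) / (2 * h)
    gamma = (up - 2 * mid + down) / (h * h)
    return delta, gamma

# Example with a toy payoff (hypothetical pricer): numerical_greeks(lambda s: max(s - 100.0, 0.0), 105.0)
```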
Things to know
Only works on the daily timeframe and for the current source price.
You can adjust the text size to fit the screen
Nadaraya-Watson Combine
This is a combination of the Lux Algo Nadaraya-Watson Estimator and Envelope. Please note the repainting issue.
In addition, I've added a plot of the actual values of the current barstate of the Nadaraya-Watson windows as they are computed (lines 92-95). It only plots values for the current data at each time update. It is interesting to compare the trajectory of the end points of the Estimator and Envelope to the smoothing function at each time update. Due to the kernel smoothing at each update, the history is lost (repaint).
I've added a feature to allow adjustment to the kernel smoothing algorithm as suggested by thomsonraja (line 59).
The settings and usage are repeated from Lux Algo below.
Settings
Window Size: Determines the number of recent price observations to be used to fit the Nadaraya-Watson Estimator.
Bandwidth: Controls the degree of smoothness of the envelopes , with higher values returning smoother results.
Mult: Controls the envelope width.
Src: Input source of the indicator.
Kernel power: See line 59, adjusts the exponential power (powh) as suggested by thomsonraja
Kernel denominator: See line 59, adjusts the denominator (den) as suggested by thomsonraja
Usage
This tool outlines extremes made by the prices within the selected window size. This is achieved by estimating the underlying trend in the price using kernel smoothing, calculating the mean absolute deviations from it, and adding/subtracting it from the estimated underlying trend.
I repeat Lux Algo's caution: 'we do not recommend this tool to be used alone or solely for real time applications.'
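For reference, a compact Python sketch of a Nadaraya-Watson estimator with a mean-absolute-deviation envelope, as outlined in the usage notes above (Gaussian kernel; the Lux Algo script's exact kernel and window handling may differ):

```python
import math

def nadaraya_watson(prices, bandwidth=8.0, mult=3.0):
    """Kernel-smoothed estimate of each point plus an upper/lower envelope."""
    n = len(prices)
    est = []
    for i in range(n):
        weights = [math.exp(-((i - j) ** 2) / (2 * bandwidth ** 2)) for j in range(n)]
        est.append(sum(w * p for w, p in zip(weights, prices)) / sum(weights))
    mae = sum(abs(p - e) for p, e in zip(prices, est)) / n * mult
    upper = [e + mae for e in est]
    lower = [e - mae for e in est]
    return est, upper, lower
```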
End-Pointed SSA of Normalized Price Corridor [Loxx]
End-Pointed SSA of Normalized Price Corridor is an end-pointed SSA of normalized input price that outputs a smoothed, normalized oscillator of price. Corridors are added in an attempt to decipher the larger trend direction of price. These corridor trend lines are based on highs and lows of price. Due to the SSA algorithm, this indicator takes some time to load on the chart, so be patient. You can adjust the lag parameter downward to speed up the indicator load time, but this will also degrade the signal. There are many different ways to use this indicator. It is also Renko chart friendly.
An example of emerging trends (these do not repaint)
What is Singular Spectrum Analysis ( SSA )?
Singular spectrum analysis ( SSA ) is a technique of time series analysis and forecasting. It combines elements of classical time series analysis, multivariate statistics, multivariate geometry, dynamical systems and signal processing. SSA aims at decomposing the original series into a sum of a small number of interpretable components such as a slowly varying trend, oscillatory components and a ‘structureless’ noise. It is based on the singular value decomposition ( SVD ) of a specific matrix constructed upon the time series. Neither a parametric model nor stationarity-type conditions have to be assumed for the time series. This makes SSA a model-free method and hence enables SSA to have a very wide range of applicability.
For our purposes here, we are only concerned with the "Caterpillar" SSA . This methodology was developed in the former Soviet Union independently (the ‘iron curtain effect’) of the mainstream SSA . The main difference between the main-stream SSA and the "Caterpillar" SSA is not in the algorithmic details but rather in the assumptions and in the emphasis in the study of SSA properties. To apply the mainstream SSA , one often needs to assume some kind of stationarity of the time series and think in terms of the "signal plus noise" model (where the noise is often assumed to be ‘red’). In the "Caterpillar" SSA , the main methodological stress is on separability (of one component of the series from another one) and neither the assumption of stationarity nor the model in the form "signal plus noise" are required.
"Caterpillar" SSA
The basic "Caterpillar" SSA algorithm for analyzing one-dimensional time series consists of:
Transformation of the one-dimensional time series to the trajectory matrix by means of a delay procedure (this gives the name to the whole technique);
Singular Value Decomposition of the trajectory matrix;
Reconstruction of the original time series based on a number of selected eigenvectors.
This decomposition initializes forecasting procedures for both the original time series and its components. The method can be naturally extended to multidimensional time series and to image processing.
The method is a powerful and useful tool of time series analysis in meteorology, hydrology, geophysics, climatology and, according to our experience, in economics, biology, physics, medicine and other sciences; that is, where short and long, one-dimensional and multidimensional, stationary and non-stationary, almost deterministic and noisy time series are to be analyzed.
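A bare-bones Python/NumPy sketch of the three "Caterpillar" SSA steps listed above (embedding, SVD, reconstruction by diagonal averaging); it is meant only to make the algorithm concrete, not to reproduce the indicator:

```python
import numpy as np

def ssa_reconstruct(series, window, components=2):
    """Caterpillar SSA: embed, decompose with SVD, and diagonal-average the leading components."""
    x = np.asarray(series, dtype=float)
    n, k = len(x), len(x) - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(k)])      # 1) trajectory matrix (window x k)
    u, s, vt = np.linalg.svd(traj, full_matrices=False)              # 2) singular value decomposition
    approx = sum(s[i] * np.outer(u[:, i], vt[i]) for i in range(components))
    recon = np.zeros(n)
    counts = np.zeros(n)
    for col in range(k):                                             # 3) diagonal averaging back to a series
        recon[col:col + window] += approx[:, col]
        counts[col:col + window] += 1
    return recon / counts

# smoothed = ssa_reconstruct(closes, window=30, components=2)
```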
Included
Bar coloring
Signals
Alerts
Loxx's Expanded Source Types
End-pointed SSA of Williams %R [Loxx]
End-pointed SSA of Williams %R is an indicator that runs the Williams %R calculation through a Singular Spectrum Analysis (SSA) algorithm to derive a smoother final output. The reduction in noise from the traditional Williams %R is significant.
What is Williams %R?
Williams %R , also known as the Williams Percent Range, is a type of momentum indicator that moves between 0 and -100 and measures overbought and oversold levels. The Williams %R may be used to find entry and exit points in the market. The indicator is very similar to the Stochastic oscillator and is used in the same way. It was developed by Larry Williams and it compares a stock’s closing price to the high-low range over a specific period, typically 14 days or periods.
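The underlying Williams %R calculation is simple; a Python sketch over the standard 14-period lookback:

```python
def williams_r(highs, lows, closes, period=14):
    """%R = -100 * (highest high - close) / (highest high - lowest low)."""
    hh = max(highs[-period:])
    ll = min(lows[-period:])
    return -100.0 * (hh - closes[-1]) / (hh - ll) if hh != ll else 0.0
```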
What is Singular Spectrum Analysis ( SSA )?
Singular spectrum analysis ( SSA ) is a technique of time series analysis and forecasting. It combines elements of classical time series analysis, multivariate statistics, multivariate geometry, dynamical systems and signal processing. SSA aims at decomposing the original series into a sum of a small number of interpretable components such as a slowly varying trend, oscillatory components and a ‘structureless’ noise. It is based on the singular value decomposition ( SVD ) of a specific matrix constructed upon the time series. Neither a parametric model nor stationarity-type conditions have to be assumed for the time series. This makes SSA a model-free method and hence enables SSA to have a very wide range of applicability.
For our purposes here, we are only concerned with the "Caterpillar" SSA . This methodology was developed in the former Soviet Union independently (the ‘iron curtain effect’) of the mainstream SSA . The main difference between the main-stream SSA and the "Caterpillar" SSA is not in the algorithmic details but rather in the assumptions and in the emphasis in the study of SSA properties. To apply the mainstream SSA , one often needs to assume some kind of stationarity of the time series and think in terms of the "signal plus noise" model (where the noise is often assumed to be ‘red’). In the "Caterpillar" SSA , the main methodological stress is on separability (of one component of the series from another one) and neither the assumption of stationarity nor the model in the form "signal plus noise" are required.
"Caterpillar" SSA
The basic "Caterpillar" SSA algorithm for analyzing one-dimensional time series consists of:
Transformation of the one-dimensional time series to the trajectory matrix by means of a delay procedure (this gives the name to the whole technique);
Singular Value Decomposition of the trajectory matrix;
Reconstruction of the original time series based on a number of selected eigenvectors.
This decomposition initializes forecasting procedures for both the original time series and its components. The method can be naturally extended to multidimensional time series and to image processing.
The method is a powerful and useful tool of time series analysis in meteorology, hydrology, geophysics, climatology and, according to our experience, in economics, biology, physics, medicine and other sciences; that is, where short and long, one-dimensional and multidimensional, stationary and non-stationary, almost deterministic and noisy time series are to be analyzed.
Included:
Bar coloring
Alerts
Signals
Loxx's Expanded Source Types
Related Williams %R Indicators
Williams %R on Chart w/ Dynamic Zones
Williams %R w/ Bollinger Bands
Intermediate Williams %R w/ Discontinued Signal Lines
Related SSA Indicators
End-pointed SSA of FDASMA
End-pointed SSA of Normalized Price Oscillator
SSA of Price [Loxx]
SSA of Price is an indicator that runs an SSA calculation on price to derive the final output. This indicator also serves to introduce the concept of SSA to the Pine Coder community. The data returned from this algorithm is an array of modeled values for the past X bars. Unlike the end-pointed SSA posted previously, this version pulls the modeled data from the output array and draws a line backward from the current bar. This indicator recalculates, so past observations aren't very useful, but the current observation is, since the current bar is index 0 of the output array, which means it's the end-pointed value.
What is Singular Spectrum Analysis ( SSA )?
Singular spectrum analysis ( SSA ) is a technique of time series analysis and forecasting. It combines elements of classical time series analysis, multivariate statistics, multivariate geometry, dynamical systems and signal processing. SSA aims at decomposing the original series into a sum of a small number of interpretable components such as a slowly varying trend, oscillatory components and a ‘structureless’ noise. It is based on the singular value decomposition ( SVD ) of a specific matrix constructed upon the time series. Neither a parametric model nor stationarity-type conditions have to be assumed for the time series. This makes SSA a model-free method and hence enables SSA to have a very wide range of applicability.
For our purposes here, we are only concerned with the "Caterpillar" SSA . This methodology was developed in the former Soviet Union independently (the ‘iron curtain effect’) of the mainstream SSA . The main difference between the main-stream SSA and the "Caterpillar" SSA is not in the algorithmic details but rather in the assumptions and in the emphasis in the study of SSA properties. To apply the mainstream SSA , one often needs to assume some kind of stationarity of the time series and think in terms of the "signal plus noise" model (where the noise is often assumed to be ‘red’). In the "Caterpillar" SSA , the main methodological stress is on separability (of one component of the series from another one) and neither the assumption of stationarity nor the model in the form "signal plus noise" are required.
"Caterpillar" SSA
The basic "Caterpillar" SSA algorithm for analyzing one-dimensional time series consists of:
Transformation of the one-dimensional time series to the trajectory matrix by means of a delay procedure (this gives the name to the whole technique);
Singular Value Decomposition of the trajectory matrix;
Reconstruction of the original time series based on a number of selected eigenvectors.
This decomposition initializes forecasting procedures for both the original time series and its components. The method can be naturally extended to multidimensional time series and to image processing.
The method is a powerful and useful tool of time series analysis in meteorology, hydrology, geophysics, climatology and, according to our experience, in economics, biology, physics, medicine and other sciences; that is, where short and long, one-dimensional and multidimensional, stationary and non-stationary, almost deterministic and noisy time series are to be analyzed.
Included:
Bar coloring
Alerts
Signals
Loxx's Expanded Source Types
End-pointed SSA of Normalized Price Oscillator [Loxx]
End-pointed SSA of Normalized Price Oscillator is an indicator that converts source price into a normalized oscillator and runs an SSA calculation to derive a smoother final output. This indicator also serves to introduce the concept of SSA to the Pine Coder community. The data returned from this algorithm is an array of modeled values for the past X bars. We could use this data, but it's not useful, so instead we use the end-pointed value, which is the first value of the array at index 0.
What is Singular Spectrum Analysis (SSA)?
Singular spectrum analysis (SSA) is a technique of time series analysis and forecasting. It combines elements of classical time series analysis, multivariate statistics, multivariate geometry, dynamical systems and signal processing. SSA aims at decomposing the original series into a sum of a small number of interpretable components such as a slowly varying trend, oscillatory components and a ‘structureless’ noise. It is based on the singular value decomposition (SVD) of a specific matrix constructed upon the time series. Neither a parametric model nor stationarity-type conditions have to be assumed for the time series. This makes SSA a model-free method and hence enables SSA to have a very wide range of applicability.
For our purposes here, we are only concerned with the "Caterpillar" SSA. This methodology was developed in the former Soviet Union independently (the ‘iron curtain effect’) of the mainstream SSA. The main difference between the main-stream SSA and the "Caterpillar" SSA is not in the algorithmic details but rather in the assumptions and in the emphasis in the study of SSA properties. To apply the mainstream SSA, one often needs to assume some kind of stationarity of the time series and think in terms of the "signal plus noise" model (where the noise is often assumed to be ‘red’). In the "Caterpillar" SSA, the main methodological stress is on separability (of one component of the series from another one) and neither the assumption of stationarity nor the model in the form "signal plus noise" are required.
"Caterpillar" SSA
The basic "Caterpillar" SSA algorithm for analyzing one-dimensional time series consists of:
Transformation of the one-dimensional time series to the trajectory matrix by means of a delay procedure (this gives the name to the whole technique);
Singular Value Decomposition of the trajectory matrix;
Reconstruction of the original time series based on a number of selected eigenvectors.
This decomposition initializes forecasting procedures for both the original time series and its components. The method can be naturally extended to multidimensional time series and to image processing.
The method is a powerful and useful tool of time series analysis in meteorology, hydrology, geophysics, climatology and, according to our experience, in economics, biology, physics, medicine and other sciences; that is, where short and long, one-dimensional and multidimensional, stationary and non-stationary, almost deterministic and noisy time series are to be analyzed.
Included:
Bar coloring
Alerts
Signals
Loxx's Expanded Source Types
Itakura-Saito Autoregressive Extrapolation of Price [Loxx]
Itakura-Saito Autoregressive Extrapolation of Price is an indicator that uses autoregressive analysis to predict future prices. This is a linear technique that was originally derived from speech analysis algorithms.
What is Itakura-Saito Autoregressive Analysis?
The technique of linear prediction has been available for speech analysis since the late 1960s (Itakura & Saito, 1973a, 1970; Atal & Hanauer, 1971), although the basic principles were established long before this by Wiener (1947). Linear predictive coding, which is also known as autoregressive analysis, is a time-series algorithm that has applications in many fields other than speech analysis (see, e.g., Chatfield, 1989).
Itakura and Saito developed a formulation for linear prediction analysis using a lattice form for the inverse filter. The Itakura–Saito distance (or Itakura–Saito divergence) is a measure of the difference between an original spectrum and an approximation of that spectrum. Although it is not a perceptual measure, it is intended to reflect perceptual (dis)similarity. It was proposed by Fumitada Itakura and Shuzo Saito in the 1960s while they were with NTT. The distance is defined as d_IS(P, P_hat) = (1/(2*pi)) * Integral[ P(w)/P_hat(w) - log(P(w)/P_hat(w)) - 1 ] dw, taken over the spectrum. The Itakura–Saito distance is a Bregman divergence, but it is not a true metric since it is not symmetric and it does not fulfil the triangle inequality.
read more: Selected Methods for Improving Synthesis Speech Quality Using Linear Predictive Coding: System Description, Coefficient Smoothing and Streak
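A hedged Python/NumPy sketch of the general autoregressive-extrapolation idea, fitting the AR coefficients by ordinary least squares rather than the Itakura-Saito lattice formulation the indicator uses:

```python
import numpy as np

def ar_extrapolate(series, order=8, future_bars=10):
    """Fit x[t] ~ sum(a_k * x[t-k]) by least squares, then extrapolate forward."""
    x = np.asarray(series, dtype=float)
    rows = [x[t - order:t][::-1] for t in range(order, len(x))]   # lagged regressors
    coeffs, *_ = np.linalg.lstsq(np.array(rows), x[order:], rcond=None)
    history = list(x)
    for _ in range(future_bars):
        lags = np.array(history[-order:][::-1])
        history.append(float(coeffs @ lags))                      # one-step-ahead prediction
    return history[-future_bars:]
```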
Data inputs
Source Settings: -Loxx's Expanded Source Types. You typically use "open" since open has already closed on the current active bar
LastBar - bar where to start the prediction
PastBars - how many bars back to model
LPOrder - order of linear prediction model; 0 to 1
FutBars - how many bars you want to forward predict
Things to know
Normally, a simple moving average is calculated on the source data. I've expanded this to 38 different averaging methods using Loxx's Moving Averages.
This indicator repaints
Related Indicators (linear extrapolation of price)
Levinson-Durbin Autocorrelation Extrapolation of Price
Weighted Burg AR Spectral Estimate Extrapolation of Price
Helme-Nikias Weighted Burg AR-SE Extra. of Price
APA Adaptive Fisher Transform [Loxx]
APA Adaptive Fisher Transform is an adaptive cycle Fisher Transform using Ehlers Autocorrelation Periodogram Algorithm to calculate the dominant cycle period.
What is an adaptive cycle, and what is Ehlers Autocorrelation Periodogram Algorithm?
From Ehlers' book Cycle Analytics for Traders: Advanced Technical Trading Concepts (2013), page 135:
"Adaptive filters can have several different meanings. For example, Perry Kaufman’s adaptive moving average (KAMA) and Tushar Chande’s variable index dynamic average (VIDYA) adapt to changes in volatility. By definition, these filters are reactive to price changes, and therefore they close the barn door after the horse is gone. The adaptive filters discussed in this chapter are the familiar Stochastic, relative strength index (RSI), commodity channel index (CCI), and band-pass filter. The key parameter in each case is the look-back period used to calculate the indicator. This look-back period is commonly a fixed value. However, since the measured cycle period is changing, it makes sense to adapt these indicators to the measured cycle period. When tradable market cycles are observed, they tend to persist for a short while. Therefore, by tuning the indicators to the measured cycle period they are optimized for current conditions and can even have predictive characteristics.
The dominant cycle period is measured using the Autocorrelation Periodogram Algorithm. That dominant cycle dynamically sets the look-back period for the indicators. I employ my own streamlined computation for the indicators that provide smoother and easier to interpret outputs than traditional methods. Further, the indicator codes have been modified to remove the effects of spectral dilation. This basically creates a whole new set of indicators for your trading arsenal."
What is Fisher Transform?
The Fisher Transform is a technical indicator created by John F. Ehlers that converts prices into a Gaussian normal distribution.
The indicator highlights when prices have moved to an extreme, based on recent prices. This may help in spotting turning points in the price of an asset. It also helps show the trend and isolate the price waves within a trend.
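A compact Python sketch of the Fisher Transform idea: rescale price into (-1, 1) over a lookback and apply 0.5 * ln((1 + x) / (1 - x)). The real indicator also smooths the intermediate value, which is omitted here:

```python
import math

def fisher_transform(prices, length=10):
    """Map the latest price into (-1, 1) over `length` bars, then Fisher-transform it."""
    window = prices[-length:]
    hi, lo = max(window), min(window)
    x = 0.0 if hi == lo else 2.0 * (prices[-1] - lo) / (hi - lo) - 1.0
    x = max(min(x, 0.999), -0.999)          # clamp to avoid log singularities
    return 0.5 * math.log((1 + x) / (1 - x))
```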
Included:
Zero-line and signal cross options for bar coloring
Customizable overbought/oversold thresholds
Alerts
Signals
Jurik Filtered, Composite Fractal Behavior (CFB) Channels [Loxx]
Double Jurik-Filtered Composite Fractal Behavior (CFB) Channels is a channel indicator that acts as both a baseline, similar to Donchian, and as support and resistance levels. This indicator is price-time adaptive, meaning it flexes to price volatility waves. The indicator's adaptive nature is calculated using the Composite Fractal Behavior (CFB) algorithm. The result of this adaptive calculation is then smoothed using Jurik filtering, and then it's normalized to conform to a range of values. This helps better identify trends.
What is Composite Fractal Behavior (CFB)?
All around you mechanisms adjust themselves to their environment. From simple thermostats that react to air temperature to computer chips in modern cars that respond to changes in engine temperature, r.p.m.'s, torque, and throttle position. It was only a matter of time before fast desktop computers applied the mathematics of self-adjustment to systems that trade the financial markets.
Unlike basic systems with fixed formulas, an adaptive system adjusts its own equations. For example, start with a basic channel breakout system that uses the highest closing price of the last N bars as a threshold for detecting breakouts on the up side. An adaptive and improved version of this system would adjust N according to market conditions, such as momentum, price volatility or acceleration.
Since many systems are based directly or indirectly on cycles, another useful measure of market condition is the periodic length of a price chart's dominant cycle, (DC), that cycle with the greatest influence on price action.
The utility of this new DC measure was noted by author Murray Ruggiero in the January '96 issue of Futures Magazine. In it, Mr. Ruggiero used it to adaptively adjust the value of N in a channel breakout system. He then simulated trading 15 years of D-Mark futures in order to compare its performance to a similar system that had a fixed optimal value of N. The adaptive version produced 20% more profit!
This DC index utilized the popular MESA algorithm (a formulation by John Ehlers adapted from Burg's maximum entropy algorithm, MEM). Unfortunately, the DC approach is problematic when the market has no real dominant cycle momentum, because the mathematics will produce a value whether or not one actually exists! Therefore, we developed a proprietary indicator that does not presuppose the presence of market cycles. It's called CFB (Composite Fractal Behavior) and it works well whether or not the market is cyclic.
CFB examines price action for a particular fractal pattern, categorizes them by size, and then outputs a composite fractal size index. This index is smooth, timely and accurate
Essentially, CFB reveals the length of the market's trending action time frame. Long trending activity produces a large CFB index and short choppy action produces a small index value. Investors have found many applications for CFB which involve scaling other existing technical indicators adaptively, on a bar-to-bar basis.
What is Jurik Volty used in the Jurik Filter?
One of the lesser known qualities of Jurik smoothing is that the Jurik smoothing process is adaptive. "Jurik Volty" (a sort of market volatility) is what makes Jurik smoothing adaptive. The Jurik Volty calculation can be used as both a standalone indicator and to smooth other indicators that you wish to make adaptive.
What is the Jurik Moving Average?
Have you noticed how moving averages add some lag (delay) to your signals? ... especially when price gaps up or down in a big move, and you are waiting for your moving average to catch up? Wait no more! JMA eliminates this problem forever and gives you the best of both worlds: low lag and smooth lines.
Ideally, you would like a filtered signal to be both smooth and lag-free. Lag causes delays in your trades, and increasing lag in your indicators typically result in lower profits. In other words, late comers get what's left on the table after the feast has already begun.
Adaptive, Double Jurik Filter Moving Average (AJFMA) [Loxx]
Adaptive, Double Jurik Filter Moving Average (AJFMA) is a moving average, like the Jurik Moving Average, but with the addition of double smoothing, adaptive length (Autocorrelation Periodogram Algorithm), and power/volatility (Jurik Volty) inputs to further reduce noise and identify trends.
What is Jurik Volty?
One of the lesser known qualities of Jurik smoothing is that the Jurik smoothing process is adaptive. "Jurik Volty" (a sort of market volatility) is what makes Jurik smoothing adaptive. The Jurik Volty calculation can be used as both a standalone indicator and to smooth other indicators that you wish to make adaptive.
What is the Jurik Moving Average?
Have you noticed how moving averages add some lag (delay) to your signals? ... especially when price gaps up or down in a big move, and you are waiting for your moving average to catch up? Wait no more! JMA eliminates this problem forever and gives you the best of both worlds: low lag and smooth lines.
Ideally, you would like a filtered signal to be both smooth and lag-free. Lag causes delays in your trades, and increasing lag in your indicators typically result in lower profits. In other words, late comers get what's left on the table after the feast has already begun.
That's why investors, banks and institutions worldwide ask for the Jurik Research Moving Average ( JMA ). You may apply it just as you would any other popular moving average. However, JMA's improved timing and smoothness will astound you.
What is adaptive Jurik volatility?
One of the lesser known qualities of Jurik smoothing is that the Jurik smoothing process is adaptive. "Jurik Volty" (a sort of market volatility) is what makes Jurik smoothing adaptive. The Jurik Volty calculation can be used as both a standalone indicator and to smooth other indicators that you wish to make adaptive.
What is an adaptive cycle, and what is Ehlers Autocorrelation Periodogram Algorithm?
From Ehlers' book Cycle Analytics for Traders: Advanced Technical Trading Concepts (2013), page 135:
"Adaptive filters can have several different meanings. For example, Perry Kaufman’s adaptive moving average (KAMA) and Tushar Chande’s variable index dynamic average (VIDYA) adapt to changes in volatility. By definition, these filters are reactive to price changes, and therefore they close the barn door after the horse is gone. The adaptive filters discussed in this chapter are the familiar Stochastic, relative strength index (RSI), commodity channel index (CCI), and band-pass filter. The key parameter in each case is the look-back period used to calculate the indicator. This look-back period is commonly a fixed value. However, since the measured cycle period is changing, it makes sense to adapt these indicators to the measured cycle period. When tradable market cycles are observed, they tend to persist for a short while. Therefore, by tuning the indicators to the measured cycle period they are optimized for current conditions and can even have predictive characteristics.
The dominant cycle period is measured using the Autocorrelation Periodogram Algorithm. That dominant cycle dynamically sets the look-back period for the indicators. I employ my own streamlined computation for the indicators that provide smoother and easier to interpret outputs than traditional methods. Further, the indicator codes have been modified to remove the effects of spectral dilation. This basically creates a whole new set of indicators for your trading arsenal."
Included
- Double calculation of AJFMA for even smoother results
Ehlers Adaptive Relative Strength Index (RSI) [Loxx]
Ehlers Adaptive Relative Strength Index (RSI) is an implementation of RSI that uses the Ehlers Autocorrelation Periodogram Algorithm to derive the length input for RSI. Other implementations of Ehlers Adaptive RSI rely on the inferior Hilbert Transformer to derive the dominant cycle.
In his book "Cycle Analytics for Traders Advanced Technical Trading Concepts", John F. Ehlers describes an implementation for Adaptive Relative Strength Index in order to solve for varying length inputs into the classic RSI equation.
What is an adaptive cycle, and what is the Autocorrelation Periodogram Algorithm?
From Ehlers' book mentioned above, page 135:
"Adaptive filters can have several different meanings. For example, Perry Kaufman’s adaptive moving average (KAMA) and Tushar Chande’s variable index dynamic average (VIDYA) adapt to changes in volatility. By definition, these filters are reactive to price changes, and therefore they close the barn door after the horse is gone. The adaptive filters discussed in this chapter are the familiar Stochastic, relative strength index (RSI), commodity channel index (CCI), and band-pass filter. The key parameter in each case is the look-back period used to calculate the indicator. This look-back period is commonly a fixed value. However, since the measured cycle period is changing, as we have seen in previous chapters, it makes sense to adapt these indicators to the measured cycle period. When tradable market cycles are observed, they tend to persist for a short while. Therefore, by tuning the indicators to the measured cycle period they are optimized for current conditions and can even have predictive characteristics.
The dominant cycle period is measured using the autocorrelation periodogram algorithm. That dominant cycle dynamically sets the look-back period for the indicators. I employ my own streamlined computation for the indicators that provide smoother and easier to interpret outputs than traditional methods. Further, the indicator codes have been modified to remove the effects of spectral dilation. This basically creates a whole new set of indicators for your trading arsenal."
What is Adaptive RSI?
From his Ehlers' book mentioned above, page 137:
"The adaptive RSI starts with the computation of the dominant cycle using the autocorrelation periodogram approach. Since the objective is to use only those frequency components passed by the roofing filter, the variable "filt" is used as a data input rather than closing prices. Rather than independently taking the averages of the numerator and denominator, I chose to perform smoothing on the ratio using the SuperSmoother filter. The coefficients for the SuperSmoother filters have previously been computed in the dominant cycle measurement part of the code."
Happy trading!
Webhook Starter Kit [HullBuster]
Introduction
This is an open source strategy which provides a framework for webhook enabled projects. It is designed to work out-of-the-box on any instrument triggering on an intraday bar interval. This is a full featured script with an emphasis on actual trading at a brokerage through the TradingView alert mechanism and without requiring browser plugins.
The source code is written in a self-documenting style with clearly defined sections. The sections “communicate” with each other through state variables, making it easy for the strategy to evolve and improve. This is an excellent place for Pine Language beginners to start their strategy building journey. The script exhibits many Pine Language features which will certainly add power to your script building abilities.
This script employs a basic trend-following strategy utilizing a forward pyramiding technique. Trend detection is implemented through the use of two higher time frame series. The market entry setup is a Simple Moving Average crossover. Positions exit by passing through conditional take profit logic. The script creates ten indicators, including a Zscore oscillator to measure support and resistance levels. The indicator parameters are exposed through 47 strategy inputs segregated into seven sections. All of the inputs are equipped with detailed tool tips to help you get started.
To improve the transition from simulation to execution, strategy.entry and strategy.exit calls show enhanced message text with embedded keywords that are combined with the TradingView placeholders at alert time, thereby enabling a single JSON message to generate multiple execution events. This is genius stuff from the Pine Language development team. Really excellent work!
This document provides a sample alert message that can be applied to this script with relatively little modification. Without altering the code, the strategy inputs can alter the behavior to generate thousands of orders or simply a few dozen. It can be applied to crypto, stocks or forex instruments. A good way to look at this script is as a webhook lab that can aid in the development of your own endpoint processor, impress your co-workers and have hours of fun.
By no means is a webhook required or even necessary to benefit from this script. The setups, exits, trend detection, pyramids and DCA algorithms can be easily replaced with more sophisticated versions. The modular design of the script logic allows you to incrementally learn and advance this script into a functional trading system that you can be proud of.
Design
This is a trend following strategy that enters long above the trend line and short below. There are five trend lines that are visible by default but can be turned off in Section 7. They are identified, in frequency order, as follows:
1. - EMA in the chart time frame. Intended to track price pressure. Configured in Section 3.
2. - ALMA in the higher time frame specified in Section 2 Signal Line Period.
3. - Linear Regression in the higher time frame specified in Section 2 Signal Line Period.
4. - Linear Regression in the higher time frame specified in Section 2 Signal Line Period.
5. - DEMA in the higher time frame specified in Section 2 Trend Line Period.
The Blue, Green and Orange lines are signal lines on the same time frame. The time frame selected should be at least five times greater than the chart time frame. The Purple line represents the trend line, for which prices above the line suggest a rising market and prices below a falling market. The time frame selected for the trend should be at least five times greater than that of the signal lines.
Three oscillators are created as follows:
1. Stochastic - In the chart time frame. Used to enter forward pyramids.
2. Stochastic - In the Trend period. Used to detect exit conditions.
3. Zscore - In the Signal period. Used to detect exit conditions.
The Stochastics are configured identically other than the time frame. The period is set in Section 2.
Two Simple Moving Averages provide the trade entry conditions in the form of a crossover. Crossing up is a long entry and down is a short. This is in fact the same setup you get when you select a basic strategy from the Pine editor. The crossovers are configured in Section 3. You can see where the crosses are occurring by enabling Show Entry Regions in Section 7.
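As a rough Pine v5 sketch of this kind of crossover entry (using the 9/24 cross lengths from the example settings further down; everything else here is illustrative rather than the script's actual code):
//@version=5
strategy("SMA Crossover Entry Sketch", overlay = true)
fastLen = input.int(9,  "Fast Cross Length")
slowLen = input.int(24, "Slow Cross Length")
fastSma = ta.sma(close, fastLen)
slowSma = ta.sma(close, slowLen)
if ta.crossover(fastSma, slowSma)    // fast crossing up through slow -> long entry
    strategy.entry("Long", strategy.long)
if ta.crossunder(fastSma, slowSma)   // fast crossing down through slow -> short entry
    strategy.entry("Short", strategy.short)
plot(fastSma, "Fast SMA", color.blue)
plot(slowSma, "Slow SMA", color.orange)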
The script has the capacity for pyramids and DCA. Forward pyramids are enabled by setting the Pyramiding value on the Properties tab to a non-zero value. In this case, add-on trades will enter the market on dips above the position open price. This process will continue until the trade exits. Downward pyramids are available in Crypto and Range mode only. In this case, add-on trades are placed below the entry price in the drawdown space until the stop is hit. To enable downward pyramids, set the Pyramid Minimum Span in Section 1 to a non-zero value.
This implementation of Dollar Cost Averaging (DCA) triggers off consecutive losses. Each loss in a run increments a sequence number. The position size is increased as a multiple of this sequence. When the position eventually closes at a profit, the sequence is reset. DCA is enabled by setting the Maximum DCA Increments in Section 1 to a non-zero value.
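A hedged sketch of how a loss-streak counter like this can be tracked in Pine v5 (the names and the linear sizing multiple are mine, not lifted from the script):
//@version=5
strategy("DCA Sizing Sketch", overlay = true)
maxDcaIncrements = input.int(3, "Maximum DCA Increments")
baseQty          = input.float(0.01, "Default order size")
var int dcaSeq = 0
if strategy.closedtrades > strategy.closedtrades[1]                        // a trade just closed
    lastPnl = strategy.closedtrades.profit(strategy.closedtrades - 1)
    dcaSeq := lastPnl < 0 ? math.min(dcaSeq + 1, maxDcaIncrements) : 0     // grow on a loss, reset on a win
orderQty = baseQty * (1 + dcaSeq)                                          // next position size
if ta.crossover(ta.sma(close, 9), ta.sma(close, 24))
    strategy.entry("Long", strategy.long, qty = orderQty)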
It should be noted that the pyramid and DCA features are implemented using a rudimentary design and as such do not perform with the precision of my invite only scripts. They are intended as a feature to stress test your webhook endpoint. As is, you will need to buttress the logic for it to be part of an automated trading system. It is for this reason that I did not apply a Martingale algorithm to this pyramid implementation. But, hey, it’s an open source script so there is plenty of room for learning and your own experimentation.
How does it work
The overall behavior of the script is governed by the Trading Mode selection in Section 1. It is the very first input, so you should think about what behavior you intend for this strategy at the onset of the configuration. As previously discussed, this script is designed to be a trend follower. The trend is defined as the direction in which the purple line is predominantly heading. In BiDir mode, SMA crossovers above the purple line will open long positions and crosses below the line will open short. If pyramiding is enabled, add-on trades will accumulate on dips above the entry price. The value applied to the Minimum Profit input in Section 1 establishes the threshold for a profitable exit. This is not a hard number exit. The conditional exit logic must be satisfied in order to permit the trade to close. This is where the effort put into the indicator calibration is realized. There are four ways the trade can exit at a profit:
1. Natural exit. When the blue line crosses the green line the trade will close. For a long position the blue line must cross under the green line (downward). For a short the blue must cross over the green (upward).
2. ALMA / Linear Regression event. The distance the blue line is from the green and the relative speed the cross is experiencing determine this event. The activation thresholds are set in Section 6 and rely on the period and length set in Section 2. A long position will exit on an upward thrust which exceeds the activation threshold. A short will exit on a downward thrust.
3. Exponential event. The distance the yellow line is from the blue and the relative speed the cross is experiencing determine this event. The activation thresholds are set in Section 3 and rely on the period and length set in the same section.
4. Stochastic event. The purple line stochastic is used to measure overbought and oversold levels with regard to position exits. Signal line positions combined with a reading over 80 signal a long profit exit. Similarly, readings below 20 signal a short profit exit. A simplified sketch of the natural and stochastic exits follows this list.
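A minimal sketch of exits 1 and 4 in Pine v5, assuming stand-in series for the plotted signal lines and omitting the minimum profit gate and the Section 6 thresholds:
//@version=5
strategy("Exit Condition Sketch", overlay = true)
// blueLine / greenLine stand in for two of the plotted signal series
blueLine   = ta.alma(close, 49, 0.85, 6)
greenLine  = ta.linreg(close, 22, 0)
stochTrend = ta.sma(ta.stoch(close, high, low, 14), 3)
longNaturalExit  = ta.crossunder(blueLine, greenLine)                 // exit 1 for longs
shortNaturalExit = ta.crossover(blueLine, greenLine)                  // exit 1 for shorts
longStochExit    = strategy.position_size > 0 and stochTrend > 80     // exit 4 for longs
shortStochExit   = strategy.position_size < 0 and stochTrend < 20     // exit 4 for shorts
if longNaturalExit or longStochExit
    strategy.close("Long", comment = "Profit exit")
if shortNaturalExit or shortStochExit
    strategy.close("Short", comment = "Profit exit")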
Another optional way to exit a position is by Bale Out. You can enable this feature in Section 1. This is a handy way to reduce risk when carrying a large pyramid stack. Instead of waiting for the entire position to recover, we exit early (bale out) as soon as the profit value has doubled.
There are lots of ways to implement a bale out but the method I used here provides a succinct example. Feel free to improve on it if you like. To see where the Bale Outs occur, enable Show Bale Outs in Section 7. Red labels are rendered below each exit point on the chart.
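One hedged way to read the Bale Out rule in Pine v5, not the author's exact method (open profit is treated in account currency here for simplicity, and entries are assumed to be placed elsewhere):
//@version=5
strategy("Bale Out Sketch", overlay = true)
minimumProfit = input.float(2000, "Minimum Profit")
baleOutOn     = input.bool(true, "Position Bale Out")
if baleOutOn and strategy.position_size != 0 and strategy.openprofit >= 2 * minimumProfit
    strategy.close_all(comment = "Bale Out")   // exit the whole stack early once profit has doubled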
There are seven selectable Trading Modes available from the drop down in Section 1:
1. Long - Uses the strategy.risk.allow_entry_in function to execute long-only trades. You will still see shorts on the chart.
2. Short - Uses the strategy.risk.allow_entry_in function to execute short-only trades. You will still see long trades on the chart.
3. BiDir - This mode is for margin trading with a stop. If a long position was initiated above the trend line and the price has now fallen below the trend, the position will be reversed after the stop is hit. Forward pyramiding is available in this mode if you set the Pyramiding value in the Properties tab. DCA can also be activated.
4. Flip Flop - This is a bidirectional trading mode that automatically reverses on a trend line crossover. This is distinctively different from BiDir since you will get a reversal even without a stop which is advantageous in non-margin trading.
5. Crypto - This mode is for crypto trading where you are buying the coins outright. In this case you likely want to accumulate coins on a crash. Especially when all the news outlets are talking about the end of Bitcoin and you see nice deep valleys on the chart. Certainly, under these conditions, the market will be well below the purple line. No margin, so you can't go short. Downward pyramids are enabled for Crypto mode when two conditions are met. First, the Pyramiding value in the Properties tab must be non-zero. Second, the Pyramid Minimum Span in Section 1 must be non-zero.
6. Range - This is a counter trend trading mode. Longs are entered below the purple trend line and shorts above. Useful when you want to test your webhook in a market where the trend line is bisecting the signal line series. Remember that this strategy is a trend follower. It’s going to get chopped out in a range bound market. By turning on the Range mode you will at least see profitable trades while stuck in the range. However, when the market eventually picks a direction, this mode will sustain losses. This range trading mode is a rudimentary implementation that will need a lot of improvement if you want to create a reliable switch hitter (trend/range combo).
7. No Trade. Useful when setting up the trend lines and the entry and exit is not important.
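A hedged illustration of how the Long and Short modes can lean on strategy.risk.allow_entry_in (only three of the seven modes are sketched, and the entry logic is just a placeholder):
//@version=5
strategy("Trading Mode Sketch", overlay = true)
tradingMode = input.string("BiDir", "Trading Mode", options = ["Long", "Short", "BiDir"])
allowed = tradingMode == "Long" ? strategy.direction.long : tradingMode == "Short" ? strategy.direction.short : strategy.direction.all
strategy.risk.allow_entry_in(allowed)          // restricts which direction strategy.entry may open
if ta.crossover(ta.sma(close, 9), ta.sma(close, 24))
    strategy.entry("Long", strategy.long)
if ta.crossunder(ta.sma(close, 9), ta.sma(close, 24))
    strategy.entry("Short", strategy.short)    // with Long mode active, this no longer opens short positions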
Once in the trade, long or short, the script tests the exit condition on every bar. If a profitable exit is not available, it then checks whether a pyramid entry is required. As mentioned earlier, the entry setups are quite primitive. Although they can easily be replaced by more sophisticated algorithms, what I really wanted to show is the diminished role of the position entry in the overall life of the trade. Professional traders spend much more time on the management of the trade beyond the market entry. While your trade entry is important, you can get in almost anywhere and still land a profitable exit.
If DCA is enabled, the size of the position will increase in response to consecutive losses. The number of times the position can increase is limited by the number set in Maximum DCA Increments of Section 1. Once the position breaks the losing streak, the trade size will return to the default quantity set in the Properties tab. It should be noted that the Initial Capital amount set in the Properties tab does not affect the simulation in the same way as a real account. In reality, running out of money will certainly halt trading. In fact, your account would be frozen long before the last penny was committed to a trade. On the other hand, TradingView will keep running the simulation until the current bar even if your funds have been technically depleted.
Webhook Integration
The TradingView alerts dialog provides a way to connect your script to an external system which could actually execute your trade. This is a fantastic feature that enables you to separate the data feed and technical analysis from the execution and reporting systems. Using this feature it is possible to create a fully automated trading system entirely on the cloud. Of course, there is some work to get it all going in a reliable fashion. Because this is a strategy-type script, placeholders such as {{strategy.position_size}} can be embedded in the alert message text. There are more than 10 variables which can write internal script values into the message for delivery to the specified endpoint.
Entry and exit use the strategy.entry and strategy.exit calls respectively. The alert_message parameter has special keywords that my endpoint expects to properly calculate position size and message sequence. The alert message will embed these keywords in the JSON object through the {{strategy.order.alert_message}} placeholder. You should use whatever keywords are expected by the endpoint you intend to webhook in to.
Here is an excerpt of the fields I use in my webhook signal:
"broker_id": "kraken",
"account_id": "XXX XXXX XXXX XXXX",
"symbol_id": "XMRUSD",
"action": "{{strategy.order.action}}",
"strategy": "{{strategy.order.id}}",
"lots": "{{strategy.order.contracts}}",
"price": "{{strategy.order.price}}",
"comment": "{{strategy.order.alert_message}}",
"timestamp": "{{time}}"
Though TradingView does a great job in dispatching your alert, this feature does come with a few idiosyncrasies. Namely, a single transaction call in your script may cause multiple transmissions to the endpoint. If you are using placeholders, each message describes part of the transaction sequence. A good example is closing a pyramid stack. Although the script makes a single strategy.close() call, the endpoint actually receives a close message for each pyramid trade. The broker, on the other hand, only requires a single close. The incongruity of this situation is exacerbated by the possibility of messages being received out of sequence, depending on the type of order designated in the message, a close or a reversal. This could have a disastrous effect on your live account. This broker simulator has no idea what is actually going on at your real account. It's just doing the job of running the simulation and sending out the computed results. If your TradingView simulation falls out of alignment with the actual trading account, lots of really bad things could happen. For example, your script thinks you are currently long but the account is actually short. Reversals from this point forward will always be wrong with no one the wiser. Human intervention will be required to restore congruence. But how does anyone find out this is occurring? In closed systems engineering this is known as entropy. In practice your webhook logic should be robust enough to detect these conditions. Be generous with the placeholder usage and give the webhook code plenty of information to compare states. Both issuer and receiver. Don't blindly commit incoming signals without verifying system integrity.
Setup
The following steps provide a very brief set of instructions that will get you started on your first configuration. After you’ve gone through the process a couple of times, you won’t need these anymore. It’s really a simple script after all. I have several example configurations that I used to create the performance charts shown. I can share them with you if you like. Of course, if you’ve modified the code then these steps are probably obsolete.
There are 47 inputs divided into seven sections. For the most part, the configuration process is designed to flow from top to bottom. Handy tool tips are available on every field to help get you through the initial setup.
Step 1. Input the Base Currency and Order Size in the Properties tab. Set the Pyramiding value to zero.
Step 2. Select the Trading Mode you intend to test with from the drop down in Section 1. I usually select No Trade until I’ve setup all of the trend lines, profit and stop levels.
Step 3. Put in your Minimum Profit and Stop Loss in the first section. This is in pips or currency basis points (chart right side scale). Remember that the profit is taken as a conditional exit not a fixed limit. The actual profit taken will almost always be greater than the amount specified. The stop loss, on the other hand, is indeed a hard number which is executed by the TradingView broker simulator when the threshold is breached.
Step 4. Apply the appropriate value to the Tick Scalar field in Section 1. This value is used to remove the pipette from the price. You can enable the Summary Report in Section 7 to see the TradingView minimum tick size of the current chart.
Step 5. Apply the appropriate Price Normalizer value in Section 1. This value is used to normalize the instrument price for differential calculations. Basically, we want to increase the magnitude to significant digits to make the numbers more meaningful in comparisons. Though I have used many normalization techniques, I have always found this method to provide a simple and lightweight solution for less demanding applications. Most of the time the default value will be sufficient. The Tick Scalar and Price Normalizer value work together within a single calculation so changing either will affect all delta result values.
Step 6. Turn on the trend line plots in Section 7. Then configure Section 2. Try to get the plots to show you what’s really happening not what you want to happen. The most important is the purple trend line. Select an interval and length that seem to identify where prices tend to go during non-consolidation periods. Remember that a natural exit is when the blue crosses the green line.
Step 7. Enable Show Event Regions in Section 7. Then adjust Section 6. Blue background fills are spikes and red fills are plunging prices. These measurements should be hard to come by, so you should see relatively few fills on the chart if you've set this up as intended. Section 6 includes the Zscore oscillator, the state of which combines with the signal lines to detect statistically significant price movement. The Zscore is a zero-based calculation with positive and negative magnitude readings. You want to input a reasonably large number slightly below the maximum amplitude seen on the chart. Both rise and fall inputs are entered as a positive real number. You can easily use my code to create a separate indicator if you want to see it in action. The default value is sufficient for most configurations.
Step 8. Turn off Show Event Regions and enable Show Entry Regions in Section 7. Then adjust Section 3. This section contains two parts: the entry setup crossovers and the EMA events. Adjust the crossovers first, that is, the Fast Cross Length and Slow Cross Length. The frequency of your trades will be shown as blue and red fills. There should be a lot. Then turn off Show Event Regions and enable Display EMA Peaks. Adjust all the fields that have the word EMA. This is actually the yellow line on the chart. The blue and red fills should show much less than the crossovers but more than the event fills shown in Step 7.
Step 9. Change the Trading Mode to BiDir if you selected No Trades previously. Look on the chart and see where the trades are occurring. Make adjustments to the Minimum Profit and Stop Offset in Section 1 if necessary. Wider profits and stops reduce the trade frequency.
Step 10. Go to Section 4 and 5 and make fine tuning adjustments to the long and short side.
Example Settings
To reproduce the performance shown on the chart please use the following configuration: (Bitcoin on the Kraken exchange)
1. Select XBTUSD Kraken as the chart symbol.
2. On the properties tab set the Order Size to: 0.01 Bitcoin
3. On the properties tab set the Pyramiding to: 12
4. In Section 1: Select “Crypto” for the Trading Mode
5. In Section 1: Input 2000 for the Minimum Profit
6. In Section 1: Input 0 for the Stop Offset (No Stop)
7. In Section 1: Input 10 for the Tick Scalar
8. In Section 1: Input 1000 for the Price Normalizer
9. In Section 1: Input 2000 for the Pyramid Minimum Span
10. In Section 1: Check mark the Position Bale Out
11. In Section 2: Input 60 for the Signal Line Period
12. In Section 2: Input 1440 for the Trend Line Period
13. In Section 2: Input 5 for the Fast Alma Length
14. In Section 2: Input 22 for the Fast LinReg Length
15. In Section 2: Input 100 for the Slow LinReg Length
16. In Section 2: Input 90 for the Trend Line Length
17. In Section 2: Input 14 for the Stochastic Length
18. In Section 3: Input 9 for the Fast Cross Length
19. In Section 3: Input 24 for the Slow Cross Length
20. In Section 3: Input 8 for the Fast EMA Length
21. In Section 3: Input 10 for the Fast EMA Rise NetChg
22. In Section 3: Input 1 for the Fast EMA Rise ROC
23. In Section 3: Input 10 for the Fast EMA Fall NetChg
24. In Section 3: Input 1 for the Fast EMA Fall ROC
25. In Section 4: Check mark the Long Natural Exit
26. In Section 4: Check mark the Long Signal Exit
27. In Section 4: Check mark the Long Price Event Exit
28. In Section 4: Check mark the Long Stochastic Exit
29. In Section 5: Check mark the Short Natural Exit
30. In Section 5: Check mark the Short Signal Exit
31. In Section 5: Check mark the Short Price Event Exit
32. In Section 5: Check mark the Short Stochastic Exit
33. In Section 6: Input 120 for the Rise Event NetChg
34. In Section 6: Input 1 for the Rise Event ROC
35. In Section 6: Input 5 for the Min Above Zero ZScore
36. In Section 6: Input 120 for the Fall Event NetChg
37. In Section 6: Input 1 for the Fall Event ROC
38. In Section 6: Input 5 for the Min Below Zero ZScore
In this configuration we are trading in long-only mode and have enabled downward pyramiding. The purple trend line is based on the day (1440) period. The length is set at 90 days, so it's going to take a while for the trend line to alter course should this symbol decide to nose dive for a prolonged amount of time. Your trades will still go long under those circumstances. Since downward accumulation is enabled, your position size will grow on the way down.
The performance example is Bitcoin so we assume the trader is buying coins outright. That being the case we don’t need a stop since we will never receive a margin call. New buy signals will be generated when the price exceeds the magnitude and speed defined by the Event Net Change and Rate of Change.
Feel free to PM me with any questions related to this script. Thank you and happy trading!
CFTC RULE 4.41
These results are based on simulated or hypothetical performance results that have certain inherent limitations. Unlike the results shown in an actual performance record, these results do not represent actual trading. Also, because these trades have not actually been executed, these results may have under- or over-compensated for the impact, if any, of certain market factors, such as lack of liquidity. Simulated or hypothetical trading programs in general are also subject to the fact that they are designed with the benefit of hindsight. No representation is being made that any account will or is likely to achieve profits or losses similar to these being shown.
Auto Phivots S/R [DM]Greetings colleagues
Today I share the classic pivot points indicator
Added options:
Standard levels
Fibonacci levels (up to 261.8)
Logarithmic scale option
//Pivot Points Standard
//Pivot Points Standard — is a technical indicator that is used to determine the levels
//at which price may face support or resistance. The Pivot Points indicator consists of
//a pivot point (PP) level and several support (S) and resistance (R) levels.
//
//Calculation
//PP, resistance and support values are calculated in different ways, depending on
//the type of the indicator, specified by the Type field in indicator inputs. To
//calculate PP and support/resistance levels, the values OPENcurr, OPENprev, HIGHprev,
//LOWprev, CLOSEprev are used, which are the values of the current open and previous
//open, high, low and close, respectively, on the indicator resolution. The indicator
//resolution is set by the input of the Pivots Timeframe. If the Pivots Timeframe is set
//to AUTO (the default value), then the increased resolution is determined by the
//following algorithm:
//
//for intraday resolutions up to and including 15 min, DAY (1D) is used
//for intraday resolutions more than 15 min, WEEK (1W) is used
//for daily resolutions MONTH is used (1M)
//for weekly and monthly resolutions, 12-MONTH (12M) is used
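As a hedged Pine v5 sketch of the AUTO rule quoted above, feeding a Fibonacci-style pivot from the previous higher-timeframe bar (the plotting details are illustrative, not the script's actual code):
//@version=5
indicator("Auto Pivot Timeframe Sketch", overlay = true)
pivotTf = timeframe.isintraday and timeframe.multiplier <= 15 ? "D" : timeframe.isintraday ? "W" : timeframe.isdaily ? "M" : "12M"
// the [1] offsets fetch the previous completed higher-timeframe bar, avoiding repainting from lookahead
[pHigh, pLow, pClose] = request.security(syminfo.tickerid, pivotTf, [high[1], low[1], close[1]], lookahead = barmerge.lookahead_on)
pp = (pHigh + pLow + pClose) / 3
r1 = pp + (pHigh - pLow) * 0.382     // Fibonacci pivot resistance
s1 = pp - (pHigh - pLow) * 0.382     // Fibonacci pivot support
plot(pp, "PP", color.gray, 2, plot.style_circles)
plot(r1, "R1 0.382", color.red, 1, plot.style_circles)
plot(s1, "S1 0.382", color.green, 1, plot.style_circles)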
iBagging Multi-IndicatorHello traders!
You know, machine learning is a very popular theme nowadays. The best tricks and methods were borrowed from math and computer science to improve and create ML algorithms. As you know, one of our analysts is a great fan of ML, so he decided to borrow one very powerful method from ML.
We have taken 5 indicators, tuned them a bit and made them vote. If the number of votes is greater than the threshold, the bullish/bearish signal shows. It's called bagging, when several algorithms vote to classify or to regress. We use an EMA cross with an NATR filter, BB Width, a divergence detector and Bull Bear Power. This bundle, in my opinion, is one of the best for defining entries. Check it out in your daily trading routine. You shouldn't forget about tuning the parameters on different coins and timeframes. We checked it on 1H BTCUSDT and the default parameters are for this combination. I hope you'll enjoy my masterpiece.
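A minimal Pine v5 sketch of the voting idea only; the five voters below are simple stand-ins, not the tuned indicators used in the script:
//@version=5
indicator("Voting Ensemble Sketch", overlay = true)
threshold = input.int(4, "Votes required", minval = 1, maxval = 5)
vote1 = ta.ema(close, 20) > ta.ema(close, 50) ? 1 : 0     // trend vote
vote2 = ta.rsi(close, 14) > 50 ? 1 : 0                    // momentum vote
vote3 = close > ta.sma(close, 20) ? 1 : 0                 // mean-position vote
vote4 = ta.mom(close, 10) > 0 ? 1 : 0                     // rate-of-change vote
vote5 = close > ta.sma(close, 100) ? 1 : 0                // longer-term filter vote
bullVotes = vote1 + vote2 + vote3 + vote4 + vote5
plotshape(bullVotes >= threshold, title = "Bullish vote", style = shape.triangleup, location = location.belowbar, color = color.green)
plotshape(bullVotes <= 5 - threshold, title = "Bearish vote", style = shape.triangledown, location = location.abovebar, color = color.red)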
How to use it?
Just add it to the chart and check the signals.
Trend Follow SystemTrend following algorithm:
1- We take 5 Fibonacci EMA values: 21, 34, 55, 89, 144.
2- We normalize the changes of these values over time between 1-100.
3- We take the ema value of 1 length so that it does not follow a horizontal course after the normalization process.
4- In order not to experience too much change, we take the value of sma with a length of 5.
5- We consider that when all values are 100 the trend is up, when all values are 0 the trend is down, and otherwise the trend is horizontal.
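A hedged Pine v5 sketch of steps 1-5; the exact normalization used in the published script is not shown, so a rolling min/max rescale of the EMA change stands in for it here:
//@version=5
indicator("Fibonacci EMA Trend Sketch")
normLen = input.int(100, "Normalization lookback")
score(simple int emaLen) =>
    chg = ta.change(ta.ema(close, emaLen))                   // step 2: change of the EMA over time
    lo  = ta.lowest(chg, normLen)
    hi  = ta.highest(chg, normLen)
    raw = hi != lo ? (chg - lo) / (hi - lo) * 100 : 50.0     // rescaled toward 0..100
    ta.sma(ta.ema(raw, 1), 5)                                // steps 3-4: EMA(1), then SMA(5)
s1 = score(21)
s2 = score(34)
s3 = score(55)
s4 = score(89)
s5 = score(144)
trendUp   = s1 == 100 and s2 == 100 and s3 == 100 and s4 == 100 and s5 == 100   // step 5, taken literally
trendDown = s1 == 0 and s2 == 0 and s3 == 0 and s4 == 0 and s5 == 0
plot((s1 + s2 + s3 + s4 + s5) / 5, "Average score")
bgcolor(trendUp ? color.new(color.green, 85) : trendDown ? color.new(color.red, 85) : na)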
[blackcat] L2 Ehlers Adaptive Jon Andersen R-Squared IndicatorLevel: 2
Background
@pips_v1 has proposed an interesting idea: is it possible to code an "Adaptive Jon Andersen R-Squared Indicator" where the length is determined by the DCPeriod as calculated in Ehlers' Sine Wave Indicator? I agreed with him and started to construct this indicator. After a study, I found that the "(blackcat) L2 Ehlers Autocorrelation Periodogram" script could be reused for this purpose, because the Ehlers Autocorrelation Periodogram is an ideal candidate for calculating the dominant cycle. On the other hand, there are two inputs for the R-Squared indicator:
Length - number of bars to calculate moment correlation coefficient R
AvgLen - number of bars to calculate average R-square
I used the Ehlers Autocorrelation Periodogram to produce a dynamic value for the "Length" input of the R-Squared indicator and make it adaptive.
Function
One tool available in forecasting the trendiness of the breakout is the coefficient of determination (R-squared), a statistical measurement. The R-squared indicates linear strength between the security's price (the Y-axis) and time (the X-axis). The R-squared is the percentage of squared error that the linear regression can eliminate if it were used as the predictor instead of the mean value. If the R-squared were 0.99, then the linear regression would eliminate 99% of the error for prediction versus predicting closing prices using a simple moving average.
When the R-squared is at an extreme low, indicating that the mean is a better predictor than regression, it can only increase, indicating that the regression is becoming a better predictor than the mean. The opposite is true for extreme high values of the R-squared.
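A hedged Pine v5 sketch of the R-squared side only, with the dominant cycle stubbed out as an input (in the full script it comes from the autocorrelation periodogram); the loop computes a Pearson correlation of price against bar offset so the window length can change from bar to bar:
//@version=5
indicator("Adaptive R-Squared Sketch")
dominantCycle = input.float(20, "Dominant cycle (stub)")
avgLen        = input.int(14, "AvgLen")
pearsonR2(int len) =>
    float sumX  = 0.0
    float sumY  = 0.0
    float sumXY = 0.0
    float sumX2 = 0.0
    float sumY2 = 0.0
    for i = 0 to len - 1
        y = close[i]
        sumX  += i
        sumY  += y
        sumXY += i * y
        sumX2 += i * i
        sumY2 += y * y
    num = len * sumXY - sumX * sumY
    den = math.sqrt(len * sumX2 - sumX * sumX) * math.sqrt(len * sumY2 - sumY * sumY)
    r   = den != 0 ? num / den : 0.0
    r * r                                              // coefficient of determination
dcLen   = math.max(3, math.round(dominantCycle))       // Length driven by the dominant cycle
avgSqrR = ta.sma(pearsonR2(dcLen), avgLen)             // AvgLen bars of smoothing
plot(avgSqrR, "AvgSqrR")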
To make this indicator adaptive, the dominant cycle is extracted from the spectral estimate in the next block of code using a center-of-gravity ( CG ) algorithm. The CG algorithm measures the average center of two-dimensional objects. The algorithm computes the average period at which the powers are centered. That is the dominant cycle. The dominant cycle is a value that varies with time. The spectrum values vary between 0 and 1 after being normalized. These values are converted to colors. When the spectrum is greater than 0.5, the colors combine red and yellow, with yellow being the result when spectrum = 1 and red being the result when the spectrum = 0.5. When the spectrum is less than 0.5, the red saturation is decreased, with the result the color is black when spectrum = 0.
Construction of the autocorrelation periodogram starts with the autocorrelation function using the minimum three bars of averaging. The cyclic information is extracted using a discrete Fourier transform (DFT) of the autocorrelation results. This approach has at least four distinct advantages over other spectral estimation techniques. These are:
1. Rapid response. The spectral estimates start to form within a half-cycle period of their initiation.
2. Relative cyclic power as a function of time is estimated. The autocorrelation at all cycle periods can be low if there are no cycles present, for example, during a trend. Previous works treated the maximum cycle amplitude at each time bar equally.
3. The autocorrelation is constrained to be between minus one and plus one regardless of the period of the measured cycle period. This obviates the need to compensate for Spectral Dilation of the cycle amplitude as a function of the cycle period.
4. The resolution of the cyclic measurement is inherently high and is independent of any windowing function of the price data.
Key Signal
DC --> Ehlers dominant cycle.
AvgSqrR --> R-squared output of the indicator.
Remarks
This is a Level 2 free and open source indicator.
Feedbacks are appreciated.
Fibonacci Extension / Retracement / Pivot Points by DGTFɪʙᴏɴᴀᴄᴄɪ Exᴛᴇɴᴛɪᴏɴ / Rᴇᴛʀᴀᴄᴍᴇɴᴛ / Pɪᴠᴏᴛ Pᴏɪɴᴛꜱ
This study combines various Fibonacci concepts into one, and some basic volume and volatility indications
█ Pɪᴠᴏᴛ Pᴏɪɴᴛꜱ — is a technical indicator that is used to determine the levels at which price may face support or resistance. The Pivot Points indicator consists of a pivot point (PP) level and several support (S) and resistance (R) levels. PP, resistance and support values are calculated in different ways, depending on the type of the indicator, this study implements Fibonacci Pivot Points
The indicator resolution is set by the input of the Pivot Points TF (Timeframe). If the Pivot Points TF is set to AUTO (the default value), then the increased resolution is determined by the following algorithm:
for intraday resolutions up to and including 5 min, 4HOURS (4H) is used
for intraday resolutions more than 5 min and up to and including 45 min, DAY (1D) is used
for intraday resolutions more than 45 min and up to and including 4 hour, WEEK (1W) is used
for daily resolutions MONTH is used (1M)
for weekly resolutions, 3-MONTH (3M) is used
for monthly resolutions, 12-MONTH (12M) is used
If the Pivot Points TF is set to User Defined, users may choose any higher timeframe of their preference
█ Fɪʙ Rᴇᴛʀᴀᴄᴇᴍᴇɴᴛ — Fibonacci retracements is a popular instrument used by technical analysts to determine support and resistance areas. In technical analysis, this tool is created by taking two extreme points (usually a peak and a trough) on the chart and dividing the vertical distance by the key Fibonacci coefficients equal to 23.6%, 38.2%, 50%, 61.8%, and 100%. This study implements an automated method of identifying the pivot lows/highs and automatically draws horizontal lines that are used to determine possible support and resistance levels
█ Fɪʙᴏɴᴀᴄᴄɪ Exᴛᴇɴꜱɪᴏɴꜱ — Fibonacci extensions are a tool that traders can use to establish profit targets or estimate how far a price may travel AFTER a retracement/pullback is finished. Extension levels are also possible areas where the price may reverse. This study implements an automated method of identifying the pivot lows/highs and automatically draws horizontal lines that are used to determine possible support and resistance levels.
IMPORTANT NOTE: The Fibonacci extensions option may require further adjustment of the study parameters for proper usage. Extensions are aimed to be used when a trend is present, and they aim to measure how far a price may travel AFTER a retracement/pullback. I strongly suggest users of this study check the education post for further details on where to use extensions and where to use retracements
Important input options for both Fibonacci Extensions and Retracements
Deviation is a multiplier that affects how much the price should deviate from the previous pivot in order for the bar to become a new pivot. Increasing its value is one way to get higher timeframe Fib Retracement Levels
Depth affects the minimum number of bars that will be taken into account when calculating pivots. A simplified sketch of this pivot logic follows.
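A hedged Pine v5 reading of the Depth input, requiring roughly depth/2 bars on either side of a candidate pivot and leaving the Deviation filter out (the built-in study's exact rules may differ):
//@version=5
indicator("Pivot Depth Sketch", overlay = true)
depth = input.int(10, "Depth", minval = 2)
leg   = math.max(1, depth / 2)
ph = ta.pivothigh(high, leg, leg)                   // confirmed leg bars after the pivot bar
pl = ta.pivotlow(low, leg, leg)
plotshape(not na(ph), title = "Pivot high", style = shape.triangledown, location = location.abovebar, color = color.red)
plotshape(not na(pl), title = "Pivot low", style = shape.triangleup, location = location.belowbar, color = color.green)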
█ Volume / Volatility Add-Ons
High Volatile Bar Indication
Volume Spike Bar Indication
Volume Weighted Colored Bars
This study builds on the built-in TradingView auto fib retracement study, with modifications applied to derive extensions and to fit this combo
Disclaimer:
Trading success is all about following your trading strategy; the indicators should fit within your trading strategy and not be traded upon solely
The script is for informational and educational purposes only. Use of the script does not constitute professional and/or financial advice. You alone have the sole responsibility of evaluating the script output and risks associated with the use of the script. In exchange for using the script, you agree not to hold dgtrd TradingView user liable for any possible claim for damages arising from any decision you make based on use of the script
(IK) Base Break BuyThis strategy first calculates areas of support (bases), and then enters trades if that support is broken. The idea is to profit off of retracement. Dollar-cost-averaging safety orders are key here. This strategy takes into account a 0.1% commission, and tests are done with an initial capital of 100.00 USD. This only goes long.
The strategy is highly customizable. I've set the default values to suit ETH/USD 15m. If you're trading this on another ticker or timeframe, make sure to play around with the settings. There is an explanation of each input in the script comments. I found this to be profitable across most 'common sense' values for settings, but tweaking led to some pretty promising results. I leaned more towards high risk/high trade volume.
Always remember though: historical performance is no guarantee of future behavior. Keep settings within your personal risk tolerance, even if it promises better profit. Anyone can write a 100% profitable script if they assume price always eventually goes up.
Check the script comments for more details, but, briefly, you can customize:
-How many bases to keep track of at once
-How those bases are calculated
-What defines a 'base break'
-Order amounts
-Safety order count
-Stop loss
Here's the basic algorithm:
-Identify support.
--Have previous candles found bottoms in the same area of the current candle bottom?
--Is this support unique enough from other areas of support?
-Determine if support is broken.
--Has the price crossed under support quickly and with certainty?
-Enter trade with a percentage of initial capital.
-Execute safety orders if price continues to drop.
-Exit trade at profit target or stop loss.
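A hedged Pine v5 sketch of that skeleton, leaving out the uniqueness checks, safety orders and dynamic take profit of the full strategy:
//@version=5
strategy("Base Break Sketch", overlay = true)
pivotLen = input.int(10, "Pivot length")
breakPct = input.float(0.3, "Break threshold (%)", step = 0.1) / 100
var float base = na
pl = ta.pivotlow(low, pivotLen, pivotLen)
if not na(pl)
    base := pl                                               // most recent confirmed support
baseBroken = not na(base) and close < base * (1 - breakPct)  // decisive cross below support
if baseBroken and strategy.position_size == 0
    strategy.entry("Long", strategy.long)
plot(base, "Nearest base", color.yellow, 1, plot.style_linebr)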
Take profit is dynamic and calculated on order entry. The bigger the 'break', the higher your take profit percentage. This target percentage is based on average position size, so as safety orders are filled, and average position size comes down, the target profit becomes easier to reach.
Stop loss can be calculated one of two ways, either a static level based on initial entry, or a dynamic level based on average position size. If you use the latter (default), be aware, your real losses will be greater than your stated stop loss percentage. For example:
-stop loss = 15%, capital = 100.00, safety order threshold = 10%
-you buy $50 worth of shares at $1 - price average is $1
-you safety $25 worth of shares at $0.9 - price average is $0.966
-you safety $25 worth of shares at $0.8 - price average is $0.925
-you get stopped out at 0.925 * (1-.15) = $0.78625, and you're left with $78.62.
This is a realized loss of ~21.4% with a stop loss set to 15%. The larger your safety order threshold, the larger your real loss in comparison to your stop loss percentage, and vice versa.
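For reference, a hedged Pine v5 sketch contrasting the two stop bases (entries are assumed to be placed elsewhere; names are illustrative, not the strategy's actual code):
//@version=5
strategy("Stop Basis Sketch", overlay = true)
stopPct     = input.float(15, "Stop loss (%)") / 100
useAvgPrice = input.bool(true, "Base stop on average position price")
var float initialEntry = na
if strategy.position_size > 0 and strategy.position_size[1] == 0
    initialEntry := strategy.position_avg_price              // price of the first fill only
stopBasis = useAvgPrice ? strategy.position_avg_price : initialEntry
if strategy.position_size > 0
    strategy.exit("Stop", stop = stopBasis * (1 - stopPct))  // dynamic basis drifts down as safety orders fill
With the dynamic basis, the stop follows the average position price down as safety orders fill, which is why the realized loss on initial capital can exceed the stated stop loss percentage, as in the worked example above.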
Indicator plots show the calculated bases in white. The closest base below price is yellow. If that base is broken, it turns purple. Once a trade is entered, profit target is shown in silver and stop loss in red.