entropy
API documentation for pandas_ta.statistics.entropy Python function.
- entropy(close, length=None, base=None, offset=None, **kwargs)
Entropy (ENTP)
Introduced by Claude Shannon in 1948, entropy measures the unpredictability of data, or equivalently, its average information content. A fair die (each outcome has p=1/6) has higher entropy than a fair coin (each outcome has p=1/2).
- Sources:
- Calculation:
- Default Inputs:
length=10, base=2
P = close / SUM(close, length)
E = SUM(-P * npLog(P) / npLog(base), length)
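The formula above maps directly onto rolling pandas operations. Below is a minimal sketch of that calculation for illustration only; it is not the library's exact implementation, and the helper name entropy_sketch and its defaults are assumed here.

```python
import numpy as np
import pandas as pd

def entropy_sketch(close: pd.Series, length: int = 10, base: float = 2.0) -> pd.Series:
    """Rolling Shannon entropy following the calculation above (illustrative sketch)."""
    # P = close / SUM(close, length): each price as a share of the rolling sum
    p = close / close.rolling(length).sum()
    # E = SUM(-P * log(P) / log(base), length): rolling sum of the information terms
    e = (-p * np.log(p) / np.log(base)).rolling(length).sum()
    e.name = f"ENTP_{length}"
    return e
```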
- Args:
close (pd.Series): Series of ‘close’s
length (int): Its period. Default: 10
base (float): Logarithmic base. Default: 2
offset (int): How many periods to offset the result. Default: 0
- Kwargs:
fillna (value, optional): pd.DataFrame.fillna(value)
fill_method (value, optional): Type of fill method
- Returns:
pd.Series: New feature generated.
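A brief usage example with the documented signature and defaults; the seeded random-walk close series is placeholder data, assumed only for demonstration.

```python
import numpy as np
import pandas as pd
import pandas_ta as ta

# Placeholder price data: a seeded random walk around 100
close = pd.Series(
    100 + np.random.default_rng(0).normal(0, 1, 250).cumsum(),
    name="close",
)

# Call the documented signature; defaults are length=10, base=2
entp = ta.entropy(close=close, length=10, base=2)
print(entp.tail())
```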