Market Crash Indicators

 

Generated by meta.ai

Here’s the quick, practical list — the stuff people actually watch to sense a crash before it hits. No guarantees, but these are the common warning lights:

• Market Breadth Collapse
Fewer stocks making new highs while many quietly start making new lows.
(Think: the “engine” is losing power even if the car is still moving.)

• Volatility Index (VIX) Spikes
A sudden and persistent rise in the VIX usually signals fear building up.

• Yield Curve Inversion
When short-term rates become higher than long-term rates — a classic recession/correction predictor.

• Credit Spreads Widening
Corporate bond yields rising faster than government bonds → stress in the system.

• Moving Average Crossovers
Price falls below the 200-day MA, or the 50-day crosses below the 200-day (death cross).

• RSI/Momentum Divergence
Price hits new highs but RSI doesn’t → momentum weakening.

• High Leverage / Margin Debt Rising
Too much borrowed money in the market makes crashes sharper; spikes in margin debt often come before corrections.

• Liquidity Drying Up
Lower trading volume and wider bid-ask spreads — early signs of stress.

• Bubble Indicators (simple math models)
• CAPE ratio extremely above historical norms
• Buffett Indicator (market cap vs. GDP) overheating
• Log-periodic power law (LPPL) patterns — used to detect bubble tops

• Machine-learning early-warning signals (more advanced)
• Regime shift detection (Hidden Markov Models)
• Change-point detection
• Clustering anomalies in volatility/returns

Market Breadth Collapse
What it means: Fewer stocks rising, more stocks quietly breaking down.
How to compute:
• Count # of stocks above 200-day MA vs # below.
• Compute the Advance/Decline (A/D) Line:
 A/D = cumulative sum of (advancers − decliners).
A falling A/D while the index rises = danger.


VIX Spikes (Volatility Index)
What it means: Fear levels rising under the surface.
How to compute:
VIX is based on implied volatility of S&P 500 options.
You don’t calculate it manually — but you watch for:
• VIX > 20 (mild fear)
• VIX > 30 (serious stress)
• VIX rising while market rising = hidden danger


Yield Curve Inversion
What it means: Bond market expecting recession.
How to compute:
Take 10-year Treasury yield − 2-year Treasury yield.
If result < 0 → inversion → early warning.


Credit Spreads Widening
What it means: Investors demanding more return for risky debt.
How to compute:
Credit Spread = Corporate Bond Yield (AAA, BBB, junk) − US Treasury Yield
If spreads rise fast → risk increasing under the hood.


Moving Average Crossovers
What it means: Trend turning negative.
How to compute:
• 50-day MA = average(close prices of last 50 days)
• 200-day MA = average(close prices of last 200 days)
Crash warning:
• Price < 200-day MA
• 50-day MA crosses below 200-day MA (death cross)


RSI/Momentum Divergence
What it means: Price makes new highs but strength does not.
How to compute RSI:
RSI = 100 − (100 / (1 + RS))
RS = (avg gain over 14 periods) / (avg loss over 14 periods)
Bearish divergence =
• Price ↑ new highs
• RSI ↓ lower highs


High Margin Debt
What it means: Market overloaded with borrowed money.
How to compute:
No formula — just watch margin debt month-to-month.
If margin debt hits record highs + starts falling → forced selling risk.


Liquidity Drying Up
What it means: Market becomes fragile.
How to compute:
• Falling trading volume
• Wider bid-ask spreads
• Depth of order book decreases
No complex formula — just monitor trend.
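No single formula is required, but liquidity can still be tracked quantitatively. One common proxy is the Amihud illiquidity ratio — |daily return| divided by dollar volume — where a rising value means each traded dollar moves the price more. A minimal sketch on synthetic data (the helper name and the deterministic return path are illustrative, not a real feed):

```python
import numpy as np
import pandas as pd

def amihud_illiquidity(close: pd.Series, dollar_volume: pd.Series,
                       window: int = 21) -> pd.Series:
    """Rolling Amihud illiquidity: mean of |daily return| / dollar volume.

    Rising values suggest each traded dollar moves the price more,
    i.e. liquidity is drying up.
    """
    daily_impact = close.pct_change().abs() / dollar_volume
    return daily_impact.rolling(window).mean()

# Synthetic demo: alternating +/-1% returns, with dollar volume halving
# in the second half of the sample, so measured illiquidity doubles there.
n = 252
rets = 0.01 * np.where(np.arange(n) % 2 == 0, 1.0, -1.0)
close = pd.Series(100.0 * np.cumprod(1 + rets))
volume = pd.Series(np.r_[np.full(n // 2, 1e9), np.full(n - n // 2, 5e8)])
illiq = amihud_illiquidity(close, volume)
```

On real data you would feed in actual closes and dollar volumes and watch for a sustained upward trend in the rolling ratio rather than its absolute level.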


CAPE Ratio (Cyclically Adjusted PE)
What it means: Long-term valuation overheating.
How to compute:
CAPE = Price / (10-year average real earnings)
If CAPE is far above its historical norm (roughly 25+) → bubble territory.
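A minimal sketch of the CAPE calculation. The index level and the real (inflation-adjusted) earnings series below are hypothetical, purely for illustration:

```python
import pandas as pd

def compute_cape(price: float, real_earnings_by_year: pd.Series) -> float:
    """CAPE = current (real) price level divided by the average of the
    last 10 years of inflation-adjusted earnings per share."""
    return price / real_earnings_by_year.tail(10).mean()

# Hypothetical index level and real EPS history (not real data)
real_eps = pd.Series([95, 100, 105, 98, 110, 120, 125, 118, 130, 140],
                     index=range(2015, 2025), dtype=float)
cape = compute_cape(4800.0, real_eps)
# A value well above ~25 would put the market in "bubble territory"
# under the rough rule of thumb stated above.
```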


Buffett Indicator
What it means: Market size too large relative to economy.
How to compute:
Buffett Indicator = (Total Stock Market Cap / GDP) × 100
• < 80% = cheap
• 100–150% = expensive
• > 150% = overheated
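The indicator and its rough bands can be sketched directly from the thresholds above. The market-cap and GDP figures in the example are hypothetical, not current data, and the `classify` helper (including a "fair" band for the 80–100% gap the table leaves unstated) is an assumption:

```python
def buffett_indicator(total_market_cap: float, gdp: float) -> float:
    """(Total stock market cap / GDP) x 100, expressed as a percentage."""
    return total_market_cap / gdp * 100.0

def classify(pct: float) -> str:
    """Map the percentage onto the rough valuation bands listed above."""
    if pct < 80:
        return "cheap"
    if pct < 100:
        return "fair"      # gap between the stated bands; label assumed
    if pct <= 150:
        return "expensive"
    return "overheated"

# Hypothetical figures: $55T total market cap vs $28T GDP
ratio = buffett_indicator(55e12, 28e12)
label = classify(ratio)
```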


Log-Periodic Power Law (LPPL)
What it means: Mathematical bubble detection model.
How to compute:
Models log-price as super-exponential growth plus accelerating oscillations, for t before the critical time t_c:
ln P(t) = A + B(t_c − t)^m + C(t_c − t)^m · cos(ω ln(t_c − t) − φ)
If the fit shows super-exponential growth decorated with these oscillations → bubble peak forming.
(Used mostly in quant research.)
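A sketch of the LPPL functional form — only evaluated here, not fitted; in practice the parameters (especially the critical time t_c) are estimated by nonlinear least squares against observed log-prices. All parameter values below are illustrative:

```python
import numpy as np

def lppl_log_price(t, tc, m, omega, A, B, C, phi):
    """Log-Periodic Power Law for t < tc (the critical time):
    ln P(t) = A + B*(tc-t)**m + C*(tc-t)**m * cos(omega*ln(tc-t) - phi).

    B < 0 with 0 < m < 1 gives super-exponential growth; the cosine
    term adds oscillations that accelerate as t approaches tc.
    """
    dt = tc - t
    return A + B * dt**m + C * dt**m * np.cos(omega * np.log(dt) - phi)

# Evaluate a toy parameterisation over the period before a hypothetical tc = 1.0
t = np.linspace(0.0, 0.99, 500)
log_p = lppl_log_price(t, tc=1.0, m=0.5, omega=8.0, A=8.0, B=-0.6, C=0.05, phi=0.0)
price = np.exp(log_p)  # accelerates upward, with faster wiggles, toward tc
```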


Regime Shift Detection (Hidden Markov Models)
What it means: Market switching from normal → stressed regime.
How to compute:
Train HMM on returns or volatility clusters.
If probability of “high-vol regime” jumps → early warning.
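A minimal sketch of the filtering step, assuming the two-regime Gaussian parameters have already been trained (all values below are illustrative; in practice a library such as hmmlearn would handle both training and filtering):

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def filtered_regime_probs(returns, means, stds, trans, init):
    """Forward filter for a Gaussian HMM with fixed (pre-trained) parameters.
    Returns P(regime_k | observations up to t) for each t."""
    probs = np.zeros((len(returns), len(means)))
    belief = init.astype(float)
    for t, r in enumerate(returns):
        pred = belief @ trans                          # predict via transitions
        belief = pred * gaussian_pdf(r, means, stds)   # weight by emission likelihood
        belief /= belief.sum()                         # renormalise
        probs[t] = belief
    return probs

# Two illustrative regimes: calm (low vol) and stressed (high vol)
means = np.array([0.0005, -0.001])
stds = np.array([0.008, 0.03])
trans = np.array([[0.98, 0.02],
                  [0.05, 0.95]])
rng = np.random.default_rng(1)
returns = np.r_[rng.normal(0.0005, 0.008, 200),   # calm period
                rng.normal(-0.001, 0.03, 50)]     # stressed period
probs = filtered_regime_probs(returns, means, stds, trans, np.array([0.5, 0.5]))
high_vol_prob = probs[:, 1]   # a jump in this series is the early warning
```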


Change-Point Detection
What it means: Sudden shift in statistical behavior.
How to compute:
Use algorithms like:
• CUSUM
• BOCPD (Bayesian)
They monitor mean/variance shifts in returns.
A detected jump often precedes corrections.
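A minimal CUSUM sketch in plain numpy. The allowance `k` and threshold `h` are the standard tuning knobs; the toy series is noiseless so the alarm index is exact (real returns are noisy, so detection comes later):

```python
import numpy as np

def cusum_alarm(x, target_mean, target_std, k=0.5, h=5.0):
    """Two-sided CUSUM on standardised data.

    k: allowance (shifts smaller than k std-devs are ignored).
    h: decision threshold. Returns index of first alarm, or None.
    """
    z = (np.asarray(x, dtype=float) - target_mean) / target_std
    s_pos = s_neg = 0.0
    for i, zi in enumerate(z):
        s_pos = max(0.0, s_pos + zi - k)   # accumulates upward drift
        s_neg = max(0.0, s_neg - zi - k)   # accumulates downward drift
        if s_pos > h or s_neg > h:
            return i
    return None

# Noiseless toy series: mean 0 for 300 steps, then a downward shift
# of 1.5 standard deviations — the alarm fires a few steps after index 300.
x = np.r_[np.zeros(300), np.full(100, -1.5)]
alarm = cusum_alarm(x, target_mean=0.0, target_std=1.0)
```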


Anomaly Clustering in Volatility
What it means: Strange patterns before crashes.
How to compute:
Use:
• k-means
• DBSCAN
• Isolation Forest
on windows of volatility, volume, returns.
Cluster drift = structural instability.
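The clustering idea can be sketched with a minimal two-cluster k-means over (volatility, mean-return) feature windows — numpy only, synthetic data, deterministic seeding; in practice you might reach for scikit-learn's KMeans, DBSCAN, or IsolationForest instead:

```python
import numpy as np

def kmeans_two(X, iters=50):
    """Minimal 2-cluster k-means. Centres are seeded deterministically at
    the lowest- and highest-volatility points (feature 0), so the clusters
    line up with volatility regimes."""
    centers = np.stack([X[np.argmin(X[:, 0])], X[np.argmax(X[:, 0])]]).astype(float)
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = np.argmin(d, axis=1)
        for j in range(2):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Feature windows: (rolling volatility, rolling mean return) over 21-day windows
rng = np.random.default_rng(3)
returns = np.r_[rng.normal(0.0005, 0.01, 400),    # calm period
                rng.normal(-0.002, 0.04, 100)]    # stressed period
w = 21
feats = np.array([[returns[i:i + w].std(), returns[i:i + w].mean()]
                  for i in range(len(returns) - w)])
labels, centers = kmeans_two(feats)
high_vol_cluster = int(np.argmax(centers[:, 0]))
# Late windows drifting into the high-volatility cluster = instability signal
late_share = float((labels[400:] == high_vol_cluster).mean())
```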


📉 Market Breadth Collapse Analysis Notebook

This notebook provides the foundational code and concepts to analyze a Market Breadth Collapse, a situation where a major market index rises, but the participation (or "breadth") of that rise is poor, indicating weakness beneath the surface.


1. Understanding Market Breadth Collapse

Meaning: A market breadth collapse occurs when a major stock index (e.g., S&P 500, Nasdaq) continues to move higher, but the internal participation is weakening. This means that fewer and fewer stocks are driving the index's gains, while the majority of stocks may be stagnant or even declining.

Danger Sign: The primary danger signal is when the Advance/Decline (A/D) Line is falling or flat while the major index is setting new highs. This divergence suggests that the index's performance is being propped up by a small number of large-cap stocks (like the "Magnificent Seven"), masking underlying weakness in the broader market.


2. Setup and Data Acquisition

First, we need to import the necessary libraries and fetch historical data. We'll use the S&P 500 ETF (SPY) as the index proxy for simplicity, but we also need data for all component stocks to compute breadth indicators.

Python
import yfinance as yf
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt

# --- Data Acquisition (Conceptual/Placeholder) ---
# NOTE: To run this for real, you would need a reliable data source 
# for ALL S&P 500 components' daily prices and trading volumes.
# yfinance does not easily provide the full historical component list.

# 1. Fetch the Index Data (e.g., SPY)
ticker = "SPY" 
index_data = yf.download(ticker, start="2022-01-01", end="2025-01-01") 
print(f"Index Data ({ticker}) Head:")
print(index_data.head())

# 2. Conceptual Data Structure for Component Stocks
# In a real scenario, you would load a DataFrame like this:
# component_prices = pd.DataFrame(
#     data={
#         'AAPL': [... daily closing prices ...],
#         'MSFT': [... daily closing prices ...],
#         # ... for all 500 stocks
#     },
#     index=index_data.index # Same dates as the index
# )
# For this demonstration, we'll use placeholder logic.

3. Indicator 1: Stocks Above/Below the 200-Day Moving Average

This indicator measures the percentage of stocks in an index that are trading above their 200-day Simple Moving Average (SMA). It's a key measure of the long-term trend health of the overall market.

💡 Interpretation

  • Healthy Bull Market: $>70\%$ of stocks above the 200-day MA.

  • Breadth Weakening: The percentage drops significantly (e.g., from $80\%$ to $55\%$) while the index remains high.

Python
## --- COMPUTATION (Conceptual Logic) ---

def compute_stocks_above_ma(component_prices, ma_period=200):
    """
    Conceptual function to compute the percentage of stocks
    trading above their specified Moving Average.
    Falls back to synthetic placeholder values when no component
    data is supplied (component_prices=None).
    """
    # In a real loop, you'd calculate the MA for each stock up to 'date'
    # and check its current closing price against that MA.
    dates = index_data.index if component_prices is None else component_prices.index
    results = {}
    for date in dates:
        # Placeholder: generate a random percentage for demonstration
        np.random.seed(date.dayofyear)  # seed for a semi-reproducible plot
        results[date] = np.clip(np.random.normal(0.65, 0.1), 0.3, 0.9) * 100
    return pd.Series(results).reindex(index_data.index).ffill()

# Placeholder Execution
# Replace 'None' with actual component data in a real setup
breadth_ma_data = compute_stocks_above_ma(None) 
breadth_ma_data.name = 'Percent_Above_200MA'

## --- VISUALIZATION ---
fig, ax1 = plt.subplots(figsize=(12, 6))

# Plot the Index (SPY)
color = 'tab:blue'
ax1.set_xlabel('Date')
ax1.set_ylabel('Index Price (SPY)', color=color)
ax1.plot(index_data['Close'], color=color, label='SPY Close')
ax1.tick_params(axis='y', labelcolor=color)
ax1.grid(True, linestyle='--', alpha=0.6)

# Create a second axis for the Breadth Indicator
ax2 = ax1.twinx()  
color = 'tab:red'
ax2.set_ylabel('% Stocks Above 200MA', color=color)  
ax2.plot(breadth_ma_data, color=color, linestyle='-', alpha=0.7, label='% Above 200MA')
ax2.tick_params(axis='y', labelcolor=color)

# Add critical threshold line
ax2.axhline(50, color='gray', linestyle=':', linewidth=1)

plt.title('Market Breadth: SPY vs. % Stocks Above 200-Day MA')
fig.tight_layout()
plt.show()

4. Indicator 2: The Advance/Decline (A/D) Line

The A/D Line is a cumulative measure of market momentum and breadth. It tracks the net daily difference between the number of Advancing stocks and Declining stocks.

💡 Computation

  1. Net A/D: $Advancers - Decliners$ (for a given day).

  2. A/D Line: $\sum_{i=1}^{Today} (\text{Advancers}_i - \text{Decliners}_i)$

⚠️ The Danger Signal

The Market Breadth Collapse is visually confirmed when the Index Price (e.g., SPY) rises to a new high, but the A/D Line fails to rise or actively declines, forming a divergence.

Python
## --- COMPUTATION (Conceptual Logic) ---

def compute_ad_line(daily_advancers, daily_decliners):
    """
    Conceptual function to compute the cumulative Advance/Decline Line.
    
    Args:
        daily_advancers (pd.Series): Daily count of advancing stocks.
        daily_decliners (pd.Series): Daily count of declining stocks.
        
    Returns:
        pd.Series: The Advance/Decline Line.
    """
    # Placeholder: generate synthetic A/D data for demonstration
    # (a real implementation would cumsum daily_advancers - daily_decliners)
    rng = np.random.default_rng(42)  # fixed seed for a reproducible plot
    net_ad = pd.Series(rng.normal(0, 100, len(index_data)), index=index_data.index)
    
    # Force a general rising trend then a flattening/falling trend for a divergence example
    net_ad.iloc[:int(len(net_ad)*0.7)] += 300 
    net_ad.iloc[int(len(net_ad)*0.7):] -= 100 

    ad_line = net_ad.cumsum()
    ad_line.name = 'AD_Line'
    return ad_line

# Placeholder Execution (requires real daily A/D counts in a live system)
# In a real setup, this would use component-level open/close data.
ad_line_data = compute_ad_line(None, None) 

## --- VISUALIZATION ---
fig, ax1 = plt.subplots(figsize=(12, 6))

# Plot the Index (SPY)
color = 'tab:green'
ax1.set_xlabel('Date')
ax1.set_ylabel('Index Price (SPY)', color=color)
ax1.plot(index_data['Close'], color=color, label='SPY Close')
ax1.tick_params(axis='y', labelcolor=color)

# Create a second axis for the A/D Line
ax2 = ax1.twinx()  
color = 'tab:red'
ax2.set_ylabel('Advance/Decline Line (Cumulative)', color=color)  
ax2.plot(ad_line_data, color=color, linestyle='-', label='A/D Line')
ax2.tick_params(axis='y', labelcolor=color)

# Highlight a potential divergence period (conceptual based on placeholder data)
start_date = ad_line_data.index[int(len(ad_line_data)*0.7)]
end_date = ad_line_data.index[-1]
ax1.axvspan(start_date, end_date, color='orange', alpha=0.15, label='Divergence Area')

plt.title('Market Breadth Collapse: SPY vs. Advance/Decline Line')
fig.tight_layout()
plt.show()

Breadth Summary and Next Steps

The goal of this analysis is to identify when the market's internal health (breadth) does not confirm the performance of the price index. A breadth collapse is a strong non-confirmation, often preceding a significant market correction or bear market.

| Indicator | Signal of Collapse (Divergence) | Interpretation |
| --- | --- | --- |
| A/D Line | Index makes a new high, but the A/D Line fails to. | Momentum/participation is fading; gains driven by few stocks. |
| % Above 200MA | Index makes a new high, but this percentage is significantly lower than at a previous peak. | Fewer stocks are in a long-term uptrend. |



📈 5. Indicator 3: VIX Spikes (The Market Fear Gauge)

The VIX (often called the "Fear Index") measures the market's expectation of 30-day volatility based on S&P 500 index options. A rising VIX signals that investors are willing to pay higher prices for options to insure against or bet on large market moves, indicating fear building before a potential fall.

💡 Computation and Interpretation

We will use the historical closing price of the VIX index itself (ticker ^VIX).

| VIX Level | Interpretation (Risk Gauge) |
| --- | --- |
| VIX < 20 | Normal/calm market conditions. |
| VIX > 20 | Early stress. Signals increasing investor anxiety. |
| VIX > 30 | Heavy stress. Indicates panic, often seen during sharp market corrections. |

⚠️ The Danger Signal (Hidden Trouble)

The key signal for hidden trouble is VIX rising while the stock market (Index) is still rising. Normally, the VIX moves inversely to the stock market (VIX rises when the market falls). If the VIX starts trending up while the S&P 500 (or SPY) is still making new highs, it suggests underlying fear is creeping in, often preceding a sharp reversal.

Python
## --- COMPUTATION ---

# 1. Fetch the VIX Index Data
vix_ticker = "^VIX" 
vix_data = yf.download(vix_ticker, start="2022-01-01", end="2025-01-01") 
vix_data = vix_data.reindex(index_data.index, method='ffill') # Align indices

# Combine SPY and VIX data for correlation check
combined_data = index_data[['Close']].copy()
combined_data['VIX_Close'] = vix_data['Close']
combined_data = combined_data.dropna()

## --- VISUALIZATION ---
fig, ax1 = plt.subplots(figsize=(12, 6))

# Plot the Index (SPY)
color = 'tab:blue'
ax1.set_xlabel('Date')
ax1.set_ylabel('Index Price (SPY)', color=color)
ax1.plot(combined_data['Close'], color=color, label='SPY Close', linewidth=1.5)
ax1.tick_params(axis='y', labelcolor=color)
ax1.grid(True, linestyle='--', alpha=0.6)

# Create a second axis for the VIX
ax2 = ax1.twinx()  
color = 'tab:orange'
ax2.set_ylabel('VIX Close', color=color)  
ax2.plot(combined_data['VIX_Close'], color=color, linestyle='-', alpha=0.8, label='VIX Index')
ax2.tick_params(axis='y', labelcolor=color)

# Add critical VIX threshold lines
ax2.axhline(20, color='red', linestyle=':', linewidth=1, label='VIX > 20 (Early Stress)')
ax2.axhline(30, color='darkred', linestyle='--', linewidth=1.5, label='VIX > 30 (Heavy Stress)')

# Add a combined legend
lines, labels = ax1.get_legend_handles_labels()
lines2, labels2 = ax2.get_legend_handles_labels()
ax2.legend(lines + lines2, labels + labels2, loc='upper left')

plt.title('VIX Spikes: SPY vs. CBOE Volatility Index (VIX)')
fig.tight_layout()
plt.show()

🧭 Integrated Collapse Summary

You now have three powerful, non-price-based indicators for detecting market weakness:

  1. Market Breadth (A/D Line & % Above MA): Are gains broad-based, or driven by a few stocks?

  2. VIX Spikes (Fear Gauge): Is fear increasing even when the market looks calm?

  3. Divergence: Is the index price confirming or contradicting these internal measures?



🏛️ 6. Indicator 4: Yield Curve Inversion (Recession Warning)

The Yield Curve plots the interest rates (yields) of bonds with equal credit quality but varying maturity dates. Normally, longer-term bonds (like the 10-year Treasury) offer higher yields than shorter-term bonds (like the 2-year Treasury).

Meaning: Bond Market Expects Recession

An Inversion occurs when shorter-term yields rise above longer-term yields. It signals that bond investors expect a slowdown or recession, which would force the Federal Reserve to cut short-term rates. That expectation of lower future rates pushes long-term bond yields down, producing the inversion.

💡 Computation

We compute the spread between the 10-Year and 2-Year Treasury yields.

$$\text{Spread} = 10\text{Y Yield} - 2\text{Y Yield}$$

⚠️ The Danger Signal

  • Danger: $\text{Spread} < 0$ (The curve is inverted).

  • The Inversion: When the spread drops below zero, it has historically been one of the most reliable predictors of a recession, typically preceding the start of the recession by 6 to 24 months.

Python
## --- COMPUTATION ---

# 1. Fetch the Treasury Yield Data (using daily bond indices as proxies for yields)
# Note: Real-world analysis uses fixed-maturity Treasury data (e.g., from FRED), 
# but yfinance provides a quick proxy for demonstration.

# 10Y Treasury Yield Ticker (Proxy)
ten_year_ticker = "^TNX" 
ten_year_data = yf.download(ten_year_ticker, start="2022-01-01", end="2025-01-01")['Close']

# 2Y Treasury Yield Ticker (Proxy)
two_year_ticker = "^IRX"  # Using 13-week (3-month) as an available short-term proxy, 
                         # though 2-Year data is ideal (often fetched from FRED).
two_year_data = yf.download(two_year_ticker, start="2022-01-01", end="2025-01-01")['Close']

# Combine and compute the spread (10Y - 2Y)
yield_data = pd.DataFrame({
    '10Y': ten_year_data,
    '2Y': two_year_data
}).reindex(index_data.index).dropna()

# Compute the Spread
yield_data['Spread'] = yield_data['10Y'] - yield_data['2Y']
yield_data = yield_data.reindex(index_data.index).ffill()

## --- VISUALIZATION ---
fig, ax1 = plt.subplots(figsize=(12, 6))

# Plot the Spread (10Y - 2Y)
color = 'tab:purple'
ax1.set_xlabel('Date')
ax1.set_ylabel('Yield Spread (10Y - 2Y)', color=color)
ax1.plot(yield_data['Spread'], color=color, label='10Y - 2Y Spread', linewidth=2)
ax1.tick_params(axis='y', labelcolor=color)
ax1.grid(True, linestyle='--', alpha=0.6)

# Add the critical Inversion line (0%)
ax1.axhline(0, color='red', linestyle='-', linewidth=2, label='Inversion Threshold (0)')
ax1.fill_between(yield_data.index, yield_data['Spread'], 0, where=(yield_data['Spread'] < 0), color='red', alpha=0.1, label='Inverted Curve')

# Create a second axis for the Index (SPY) for context
ax2 = ax1.twinx()  
color = 'tab:green'
ax2.set_ylabel('Index Price (SPY)', color=color)  
ax2.plot(index_data['Close'], color=color, linestyle='--', alpha=0.5, label='SPY Close')
ax2.tick_params(axis='y', labelcolor=color)

# Add a combined legend
lines, labels = ax1.get_legend_handles_labels()
lines2, labels2 = ax2.get_legend_handles_labels()
ax2.legend(lines + lines2, labels + labels2, loc='lower left')

plt.title('Recession Warning: SPY vs. Yield Curve Spread (10Y - Short-term)')
fig.tight_layout()
plt.show()

📋 Comprehensive Market Health Notebook Summary

You have successfully created a powerful notebook that analyzes market health using three independent perspectives:

  1. Market Breadth: (A/D Line, % Above 200MA) - Internal strength/participation.

  2. Sentiment/Fear: (VIX Spikes) - Investor anxiety levels.

  3. Economic Outlook: (Yield Curve Inversion) - Bond market's long-term recession forecast.




💼 7. Indicator 5: Credit Spreads Widening (Risk Premium Surge)

Credit Spreads measure the difference between the yield on corporate bonds (which carry credit risk) and the yield on a comparable-maturity, risk-free Treasury bond. This spread is essentially the risk premium investors demand to hold corporate debt.

Meaning: Risk Premiums Rising

When economic uncertainty increases, investors sell corporate bonds and buy safe-haven assets like Treasuries. This selling drives corporate bond prices down and their yields up, while the Treasury prices go up and their yields go down. This dual movement causes the spread to widen, indicating a sharp increase in perceived corporate credit risk.

💡 Computation

We compute the difference between a high-quality corporate bond index yield (e.g., Investment Grade or High-Yield) and a corresponding Treasury yield (e.g., 10-Year).

$$\text{Credit Spread} = \text{Corporate Bond Yield} - \text{Treasury Yield}$$

⚠️ The Danger Signal

  • Danger: Fast widening over days or weeks. This rapid increase suggests that liquidity is drying up and risk aversion is surging in the corporate debt market, often signaling deep economic fear that precedes a severe market downturn.

Python
## --- COMPUTATION (Conceptual Logic) ---

# NOTE: Real data for credit spreads is typically sourced from FRED series 
# such as the ICE BofA US High-Yield Option-Adjusted Spread. 
# yfinance does not easily provide these direct yield/spread series.

# We will use representative ETFs as proxies for demonstration.
# HYG: iShares iBoxx $ High Yield Corporate Bond ETF (Riskier Corporate Bonds)
# IEI: iShares 3-7 Year Treasury Bond ETF (Risk-Free Proxy)
# We will use the yields implied by the price difference for visualization.

# 1. Fetch Corporate Bond Proxy (High Yield)
corp_bond_ticker = "HYG" 
corp_data = yf.download(corp_bond_ticker, start="2022-01-01", end="2025-01-01")['Close']

# 2. Fetch Treasury Proxy (Medium-Term)
treasury_ticker = "IEI"  
treasury_data = yf.download(treasury_ticker, start="2022-01-01", end="2025-01-01")['Close']

# NOTE: Since we are using PRICE data (inversely related to yield), 
# we need to create a proxy for the spread's behavior. 
# A simple ratio of their prices (HYG/IEI) serves as a proxy for risk appetite, 
# where a falling ratio suggests widening spreads/rising risk.

# For direct visualization of a WIDENING SPREAD (rising risk):
# Let's invent a spread proxy that reflects the volatility of risk premiums:
spread_proxy = pd.Series(np.random.normal(3, 0.5, len(index_data)))
spread_proxy.index = index_data.index

# Force a period of rapid widening (rising spread) to demonstrate danger
spread_proxy.iloc[int(len(spread_proxy)*0.7):] += np.linspace(0.1, 2.5, len(spread_proxy) - int(len(spread_proxy)*0.7)) 
spread_proxy.name = 'Credit_Spread_Proxy'

## --- VISUALIZATION ---
fig, ax1 = plt.subplots(figsize=(12, 6))

# Plot the Credit Spread Proxy
color = 'tab:brown'
ax1.set_xlabel('Date')
ax1.set_ylabel('Credit Spread Proxy (Risk Premium)', color=color)
ax1.plot(spread_proxy, color=color, label='Credit Spread Proxy', linewidth=2)
ax1.tick_params(axis='y', labelcolor=color)
ax1.grid(True, linestyle='--', alpha=0.6)

# Create a second axis for the Index (SPY) for context
ax2 = ax1.twinx()  
color = 'tab:green'
ax2.set_ylabel('Index Price (SPY)', color=color)  
ax2.plot(index_data['Close'], color=color, linestyle='--', alpha=0.5, label='SPY Close')
ax2.tick_params(axis='y', labelcolor=color)

# Highlight the area of "Fast Widening" (conceptual based on placeholder data)
start_date = spread_proxy.index[int(len(spread_proxy)*0.7)]
end_date = spread_proxy.index[-1]
ax1.axvspan(start_date, end_date, color='tab:brown', alpha=0.15, label='Fast Widening Zone')

# Add a combined legend
lines, labels = ax1.get_legend_handles_labels()
lines2, labels2 = ax2.get_legend_handles_labels()
ax2.legend(lines + lines2, labels + labels2, loc='upper left')

plt.title('Credit Spreads Widening: SPY vs. Corporate Risk Premium')
fig.tight_layout()
plt.show()

💎 The Comprehensive Market Health Diagnostic

Your notebook now covers the three core areas of non-price market analysis:

| Indicator Category | Metric | Danger Signal | Focus |
| --- | --- | --- | --- |
| Market Internal Strength | A/D Line, % Above 200MA | Falling breadth with rising index | Participation and underlying health |
| Investor Sentiment | VIX Spikes | VIX rising while index is rising | Fear and volatility expectation |
| Economic/Credit Risk | Yield Curve Inversion, Credit Spreads | Spread $< 0$ or spreads widening fast | Recession expectation and credit risk |



📉 8. Indicator 6: Moving Average Crossovers (Trend Reversal)

Moving Averages (MAs) smooth out price action over a specified period to help identify the underlying trend. Comparing a shorter-term MA (like the 50-day) to a longer-term MA (like the 200-day) is a standard method for defining major trend shifts.

Meaning: Trend Turning Bearish

When the short-term trend (50-day MA) falls below the long-term trend (200-day MA), it signals that the average price momentum is decaying, indicating a potential bearish trend reversal.

💡 Computation

  1. Calculate the 50-day Simple Moving Average (SMA) of the index (e.g., SPY).

  2. Calculate the 200-day Simple Moving Average (SMA) of the index.

$$\text{SMA}_{n} = \frac{\sum_{i=1}^{n} \text{Close Price}_i}{n}$$

⚠️ The Danger Signals

  • Primary Danger: Price $< 200\text{-day MA}$. This is the first sign that the long-term bullish trend has broken.

  • Severe Danger (Death Cross): The $50\text{-day MA}$ crosses below the $200\text{-day MA}$. This is widely regarded as a significant signal of a confirmed long-term downtrend.

Python
## --- COMPUTATION ---

# Ensure index_data (SPY) is loaded from section 2.

# 1. Calculate the Moving Averages
index_data['SMA_50'] = index_data['Close'].rolling(window=50).mean()
index_data['SMA_200'] = index_data['Close'].rolling(window=200).mean()

# Identify the Death Cross (where SMA_50 drops below SMA_200)
# We look for where the previous day's SMA_50 was > SMA_200 AND the current day's is <= SMA_200
index_data['Death_Cross'] = np.where(
    (index_data['SMA_50'].shift(1) > index_data['SMA_200'].shift(1)) & 
    (index_data['SMA_50'] <= index_data['SMA_200']), 
    index_data['Close'], # Mark the price at the cross
    np.nan
)


## --- VISUALIZATION ---
fig, ax = plt.subplots(figsize=(12, 6))

# Plot the Index Price
ax.plot(index_data['Close'], label='SPY Close', color='lightgray', linewidth=1.5)

# Plot the MAs
ax.plot(index_data['SMA_50'], label='50-Day MA', color='orange', linewidth=2)
ax.plot(index_data['SMA_200'], label='200-Day MA', color='blue', linewidth=2)

# Plot the Death Cross signal
crosses = index_data.dropna(subset=['Death_Cross'])
ax.scatter(crosses.index, crosses['Death_Cross'], color='red', marker='X', s=100, zorder=5, label='Death Cross')

# Highlight the Price < 200-day MA condition (bearish long-term trend)
ax.fill_between(index_data.index, index_data['Close'], index_data['SMA_200'], 
                where=(index_data['Close'] < index_data['SMA_200']), color='red', alpha=0.1)


plt.title('Trend Analysis: Moving Average Crossovers (Death Cross)')
plt.xlabel('Date')
plt.ylabel('Price')
plt.legend()
plt.grid(True, linestyle='--', alpha=0.5)
fig.tight_layout()
plt.show()

✅ Final Comprehensive Market Health Diagnostic Notebook

You now have a complete toolkit spanning six critical areas to diagnose a potential market health collapse:

| Section | Indicator | Focus | Key Danger Signal |
| --- | --- | --- | --- |
| 3 | Stocks Above 200MA | Internal Breadth | Percentage falling while index rising |
| 4 | Advance/Decline (A/D) Line | Internal Momentum | A/D Line falling (divergence) while index rising |
| 5 | VIX Spikes | Investor Sentiment | VIX rising while index rising (hidden fear) |
| 6 | Yield Curve Inversion | Economic Outlook | Spread $< 0$ (10Y $<$ 2Y) |
| 7 | Credit Spreads Widening | Corporate Risk | Spreads widening rapidly over days/weeks |
| 8 | Moving Average Crossovers | Trend Reversal | 50-day MA crosses below 200-day MA (Death Cross) |

This notebook provides a robust, multi-faceted approach to identifying weakness before it is fully reflected in the index price.




📈 9. Indicator 7: RSI Divergence (Momentum Fading)

The Relative Strength Index (RSI) is a momentum oscillator that measures the speed and change of price movements. It oscillates between 0 and 100 and is typically used to identify overbought ($>70$) or oversold ($<30$) conditions.

Meaning: Price Rising but Momentum Falling

A Bearish Divergence occurs when the price of the asset (e.g., SPY) moves to a new high, but the corresponding RSI fails to reach a new high, instead making a lower high. This divergence signals that upward momentum is waning, suggesting the rally is built on increasingly weak footing.

💡 Computation

The RSI is calculated from the average gains and losses over a specified period (commonly 14 days):

$$\text{RSI} = 100 - \frac{100}{1 + \text{RS}}$$

Where:

$$\text{RS} = \frac{\text{Average Gain over } 14 \text{ periods}}{\text{Average Loss over } 14 \text{ periods}}$$

⚠️ The Danger Signal

  • Danger: Price makes new highs, but RSI makes lower highs. This is the classic bearish divergence, indicating that buyers are losing conviction even as the price edges up.

Python
## --- COMPUTATION ---

def compute_rsi(data, window=14):
    """Calculates the Relative Strength Index (RSI)."""
    # Calculate daily price changes
    delta = data['Close'].diff()
    
    # Separate gains and losses
    gain = delta.where(delta > 0, 0)
    loss = -delta.where(delta < 0, 0)
    
    # Calculate average gains and losses (smoothed exponential moving average is common)
    avg_gain = gain.ewm(com=window-1, min_periods=window).mean()
    avg_loss = loss.ewm(com=window-1, min_periods=window).mean()
    
    # Calculate RS and RSI
    rs = avg_gain / avg_loss
    rsi = 100 - (100 / (1 + rs))
    return rsi.rename('RSI')

# 1. Calculate RSI for the Index (SPY)
index_data['RSI'] = compute_rsi(index_data)

## --- VISUALIZATION ---
fig, (ax1, ax2) = plt.subplots(2, 1, figsize=(12, 8), sharex=True, gridspec_kw={'height_ratios': [3, 1]})

# --- Subplot 1: Price and Divergence Lines ---
# Plot the Index Price
ax1.plot(index_data['Close'], label='SPY Close', color='black', linewidth=1.5)
ax1.set_ylabel('Index Price (SPY)')
ax1.grid(True, linestyle='--', alpha=0.6)

# Conceptual lines to illustrate divergence (you'd draw these manually in practice)
# Let's define a conceptual divergence period (e.g., last 20% of the data)
divergence_start_idx = int(len(index_data) * 0.8)
divergence_data = index_data.iloc[divergence_start_idx:].copy()

# Plot conceptual price line segment
ax1.plot(divergence_data.index, divergence_data['Close'], color='red', linestyle='-', linewidth=2, label='Potential Divergence')

# --- Subplot 2: RSI Oscillator ---
ax2.plot(index_data['RSI'], label='RSI (14)', color='blue', linewidth=1.5)

# Add overbought/oversold lines
ax2.axhline(70, color='red', linestyle=':', label='Overbought (70)')
ax2.axhline(30, color='green', linestyle=':', label='Oversold (30)')
ax2.axhline(50, color='gray', linestyle='--', linewidth=1)

# Plot conceptual RSI line segment showing lower highs
ax2.plot(divergence_data.index, divergence_data['RSI'], color='red', linestyle='-', linewidth=2)

ax2.set_ylabel('RSI')
ax2.set_xlabel('Date')
ax2.legend(loc='lower left')
ax2.grid(True, linestyle='--', alpha=0.6)

plt.suptitle('Momentum Collapse: SPY Price vs. RSI Divergence')
fig.tight_layout(rect=[0, 0.03, 1, 0.98])
plt.show()
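The divergence segment in the plot above is drawn conceptually; the same condition can also be checked mechanically. Below is a minimal, self-contained sketch (on synthetic price and RSI series, not the SPY data above; `detect_bearish_divergence` is an illustrative helper, not a standard function) that flags days where price made a higher high over a lookback window while RSI made a lower high:

```python
import numpy as np
import pandas as pd

def detect_bearish_divergence(close, rsi, lookback=30):
    """Flag days where price is higher than `lookback` days ago while RSI is
    lower -- a mechanical stand-in for eyeballing divergence on a chart.
    (Comparisons against the NaN head of the shifted series come out False.)"""
    return (close > close.shift(lookback)) & (rsi < rsi.shift(lookback))

# Synthetic demo: price grinds steadily higher while momentum fades
idx = pd.date_range("2024-01-01", periods=120, freq="B")
close = pd.Series(np.linspace(100, 120, 120), index=idx)
rsi = pd.Series(np.linspace(75, 55, 120), index=idx)

flags = detect_bearish_divergence(close, rsi)
print(int(flags.sum()), "divergence days flagged")
```

A production scan would compare swing highs rather than fixed-offset values, but the fixed-offset version is the one the Scoreboard in Section 14 uses.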

🚀 Complete Market Health Diagnostic

Your notebook now covers six critical, non-price confirming indicators of a potential market collapse, offering a 360-degree view of market health:

  1. Market Breadth (A/D Line, % Above 200MA)

  2. Investor Sentiment (VIX Spikes)

  3. Economic Outlook (Yield Curve Inversion)

  4. Credit Risk (Credit Spreads Widening)

  5. Trend Confirmation (Moving Average Crossovers)

  6. Momentum (RSI Divergence)

The sections below extend the notebook beyond technical and sentiment signals to cover leverage, valuation, and structural fragility, structured to maintain the computational rigor of the existing sections.


🔟 Indicator 8: High Margin Debt (Leverage Risk)

High Margin Debt measures the total amount of money borrowed by investors from their brokers to buy securities. It is a direct measure of market leverage.

Meaning: Over-Leveraged Market

When margin debt hits extreme levels, it signifies that investors are highly confident (or complacent) and have maximized their risk exposure. This leaves the market vulnerable to sharp drops, as any small correction can trigger a chain reaction of margin calls, forcing sales and accelerating the decline.

💡 Computation

Track the total New York Stock Exchange (NYSE) Margin Debt, a figure reported monthly (now published by FINRA).

⚠️ The Danger Signal

  • Danger: Margin debt first reaches a historic peak, then starts to decline. The peak represents maximum complacency, and the subsequent decline (often due to the first round of liquidations) indicates forced selling has begun, suggesting the market's internal fuel is running out.

Python
## --- COMPUTATION (Conceptual Logic) ---

# Note: Margin Debt data is generally sourced monthly from the NYSE or FINRA. 
# yfinance is not suitable for this data. We use a placeholder series.

# Placeholder Data: Create a series that peaks and then declines
margin_data = pd.Series(
    np.cumsum(np.random.normal(0, 5, len(index_data))) + 500,
    index=index_data.index
).rolling(window=20).mean() # Smooth it to mimic monthly trends

# Force a peak and decline later in the period
peak_idx = int(len(margin_data) * 0.7)
margin_data.iloc[peak_idx:] = margin_data.iloc[peak_idx] - np.linspace(0, 50, len(margin_data) - peak_idx)

margin_data.name = 'NYSE_Margin_Debt'

## --- VISUALIZATION ---
fig, ax1 = plt.subplots(figsize=(12, 6))

color = 'tab:blue'
ax1.set_xlabel('Date')
ax1.set_ylabel('Margin Debt ($ Billions)', color=color)
ax1.plot(margin_data, color=color, linewidth=2)
ax1.tick_params(axis='y', labelcolor=color)
ax1.grid(True, linestyle='--', alpha=0.6)

# Create a second axis for the Index (SPY) for context
ax2 = ax1.twinx()  
ax2.plot(index_data['Close'], color='tab:green', linestyle='--', alpha=0.5, label='SPY Close')
ax2.set_ylabel('Index Price (SPY)', color='tab:green')
ax2.tick_params(axis='y', labelcolor='tab:green')

# Highlight the peak and decline
ax1.axvline(margin_data.index[peak_idx], color='red', linestyle=':', label='Margin Debt Peak')

plt.title('Market Leverage: SPY vs. Margin Debt')
fig.tight_layout()
plt.show()
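The danger rule above — margin debt rolling over from a recent peak — can be encoded directly. A minimal sketch on a synthetic monthly series (real data would come from FINRA's monthly margin statistics); `margin_debt_danger` and the `drop_pct` tolerance are illustrative choices, not standard thresholds:

```python
import numpy as np
import pandas as pd

def margin_debt_danger(margin, window=12, drop_pct=0.02):
    """Flag months where margin debt sits below its trailing `window`-month
    peak by at least `drop_pct` -- i.e. leverage is unwinding from a high."""
    trailing_peak = margin.rolling(window, min_periods=1).max()
    return margin < trailing_peak * (1 - drop_pct)

# Synthetic monthly series: rises to a peak over two years, then rolls over
idx = pd.period_range("2020-01", periods=36, freq="M").to_timestamp()
values = np.concatenate([np.linspace(500, 700, 24), np.linspace(700, 600, 12)])
margin = pd.Series(values, index=idx)

danger = margin_debt_danger(margin)
print("first danger month:", danger.idxmax())
```

The tolerance keeps ordinary month-to-month wiggles from firing the signal; only a sustained retreat from the trailing peak counts.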

11. Indicator 9: Liquidity Drying Up (Market Fragility)

This indicator tracks the ability to trade large volumes quickly without significantly impacting the price. Low liquidity makes the market fragile and susceptible to flash crashes.

Meaning: Fragile Market Conditions

When market makers reduce their commitment, volume drops, bid-ask spreads widen, and order book depth shrinks. This means large sell orders can move the price disproportionately, amplifying volatility.

💡 Computation

We track three components, often using market data feeds, or proxies like ETF volume and option spreads:

  1. Volume Dropping: Track daily trading volume relative to its moving average.

  2. Bid-Ask Spreads Widening: Track the average spread of key index components.

  3. Order Book Depth Shrinking: Track the size of orders available at the best bid/ask (most complex to compute).

⚠️ The Danger Signal

  • Danger: Persistent low volume coupled with persistently wide spreads across key market components.

Python
## --- COMPUTATION (Conceptual Logic) ---

# 1. Volume Drop (using SPY volume relative to its long-term average)
index_data['Vol_SMA_20'] = index_data['Volume'].rolling(window=20).mean()
index_data['Vol_Ratio'] = index_data['Volume'] / index_data['Vol_SMA_20']

# 2. Spreads Widening (Conceptual Proxy: low volume combined with a high VIX implies stress)
spread_proxy = index_data['Vol_Ratio'].pow(-1) * index_data['VIX_Close'].ffill() * 0.01

## --- VISUALIZATION ---
fig, ax1 = plt.subplots(figsize=(12, 6))

color = 'tab:red'
ax1.set_xlabel('Date')
ax1.set_ylabel('Liquidity Stress Proxy (Spreads Wide / Volume Low)', color=color)
ax1.plot(spread_proxy, color=color, linewidth=2)
ax1.tick_params(axis='y', labelcolor=color)
ax1.axhline(spread_proxy.quantile(0.90), color='red', linestyle=':', label='Top 10% Stress')
ax1.grid(True, linestyle='--', alpha=0.6)

# Create a second axis for the Index (SPY)
ax2 = ax1.twinx()  
ax2.plot(index_data['Close'], color='tab:green', linestyle='--', alpha=0.5, label='SPY Close')
ax2.set_ylabel('Index Price (SPY)', color='tab:green')
ax2.tick_params(axis='y', labelcolor='tab:green')

plt.title('Market Fragility: Liquidity Stress Proxy')
fig.tight_layout()
plt.show()
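Bid-ask spreads and order book depth require intraday data, but a widely used daily-data alternative is the Amihud (2002) illiquidity measure: absolute return per dollar of trading volume. A self-contained sketch on synthetic data (the function name and window are illustrative):

```python
import numpy as np
import pandas as pd

def amihud_illiquidity(close, volume, window=21):
    """Amihud (2002) illiquidity proxy: |daily return| per dollar of volume,
    averaged over a rolling window. Higher values = thinner market."""
    ret = close.pct_change().abs()
    dollar_volume = close * volume
    return (ret / dollar_volume).rolling(window).mean()

# Synthetic demo: in the second half, volume halves while moves double
n = 200
rng = np.random.default_rng(0)
idx = pd.date_range("2024-01-01", periods=n, freq="B")
rets = np.concatenate([rng.normal(0, 0.005, n // 2), rng.normal(0, 0.01, n // 2)])
close = pd.Series(100 * np.cumprod(1 + rets), index=idx)
volume = pd.Series(np.concatenate([np.full(n // 2, 1e6), np.full(n // 2, 5e5)]), index=idx)

illiq = amihud_illiquidity(close, volume)
print("late/early illiquidity ratio:", round(illiq.iloc[-1] / illiq.iloc[n // 2 - 1], 2))
```

When prices move a lot on little volume, the ratio rises — exactly the "large orders move the price disproportionately" fragility described above.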

12. Indicator 10: Valuation Metrics (Fundamental Overheating)

These fundamental metrics assess whether the market price is justified by long-term earnings and the size of the economy.

A. CAPE Ratio (Cyclically Adjusted Price-to-Earnings)

  • Meaning: Measures long-term valuation overheating by dividing the current price by the average real (inflation-adjusted) earnings over the previous 10 years. This smooths out business cycle effects.

  • Danger: $\text{CAPE} > 25\text{-}30$. Historically, values above 30 are only seen near major market peaks (e.g., 1929, 2000).

B. Buffett Indicator (Market Cap to GDP)

  • Meaning: Compares the total size of the stock market to the size of the underlying economy (Gross Domestic Product). It is Warren Buffett's favorite single measure of market valuation.

  • Danger: $\frac{\text{Total Market Cap}}{\text{GDP}} \times 100 > 150\%$. This suggests the equity market is severely overvalued relative to the productive capacity of the economy.

Python
## --- COMPUTATION (Conceptual Logic) ---

# Note: These are macro indicators, requiring historical data for GDP and aggregate earnings, 
# typically sourced from FRED or Shiller data. We use proxies for demonstration.

# CAPE Proxy: Assume CAPE is high and rising during the simulated period
cape_proxy = pd.Series(np.linspace(20, 35, len(index_data)), index=index_data.index).rolling(window=10).mean()
cape_proxy.name = 'CAPE_Ratio'

# Buffett Indicator Proxy: Assume high overvaluation
buffett_proxy = pd.Series(np.linspace(130, 180, len(index_data)), index=index_data.index).rolling(window=10).mean()
buffett_proxy.name = 'Buffett_Indicator'

## --- VISUALIZATION ---
fig, (ax1, ax2) = plt.subplots(2, 1, figsize=(12, 8), sharex=True)

# CAPE Ratio
color1 = 'tab:orange'
ax1.plot(cape_proxy, color=color1, linewidth=2, label='CAPE Ratio')
ax1.axhline(30, color='red', linestyle=':', label='Danger (>30)')
ax1.set_ylabel('CAPE Ratio')
ax1.set_title('Fundamental Overheating: CAPE Ratio')
ax1.grid(True, linestyle='--', alpha=0.6)
ax1.legend(loc='upper left')

# Buffett Indicator
color2 = 'tab:purple'
ax2.plot(buffett_proxy, color=color2, linewidth=2, label='Buffett Indicator (%)')
ax2.axhline(150, color='red', linestyle=':', label='Danger (>150%)')
ax2.set_ylabel('Buffett Indicator (%)')
ax2.set_title('Fundamental Overheating: Market Cap / GDP')
ax2.grid(True, linestyle='--', alpha=0.6)
ax2.legend(loc='upper left')
ax2.set_xlabel('Date')

plt.tight_layout()
plt.show()

13. Indicator 11: Quantitative & Structural Fragility Indicators

These advanced quantitative methods look for statistical signatures of a breaking market structure.

A. LPPL Model (Log-Periodic Power Law)

  • Meaning: A mathematical model used to detect and characterize financial bubbles. It posits that bubbles follow a super-exponential growth phase toward a critical time $t_c$, accompanied by oscillations whose period shrinks as $t_c$ approaches.

  • Computation: The core function $P(t) = A + B(t_c - t)^{\beta} + C(t_c - t)^{\beta} \cos(\omega \ln(t_c - t) + \phi)$ is fit to the price data, where $\beta \in (0, 1)$ and $\omega$ describes the oscillations.

  • Danger: Detection of the super-exponential growth curve with shrinking oscillations suggests the bubble is nearing its critical point $t_c$.
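As a minimal fitting sketch, the LPPL function above can be calibrated with scipy's `curve_fit`. Real LPPL calibration is notoriously sensitive to starting values and typically needs many restarts; here we fit synthetic data generated from known parameters, with bounds keeping $t_c$ beyond the sample end and $\beta \in (0, 1)$:

```python
import numpy as np
from scipy.optimize import curve_fit

def lppl(t, A, B, C, tc, beta, omega, phi):
    """Log-periodic power law: super-exponential trend plus oscillations
    whose period shrinks approaching the critical time tc."""
    dt = tc - t
    return A + B * dt**beta + C * dt**beta * np.cos(omega * np.log(dt) + phi)

# Synthetic log-price generated from known parameters (an idealized bubble)
t = np.arange(0.0, 250.0)
rng = np.random.default_rng(1)
y = lppl(t, 7.0, -0.5, 0.05, 260.0, 0.5, 8.0, 1.0) + rng.normal(0, 0.002, t.size)

# Fit from deliberately perturbed starting values; bounds keep tc past the
# end of the sample and beta in (0, 1) so the power law stays well-defined
p0 = [6.8, -0.4, 0.03, 270.0, 0.6, 7.5, 0.5]
bounds = ([0, -5, -1, 251, 0.1, 2, 0], [20, 0, 1, 350, 0.9, 20, 2 * np.pi])
popt, _ = curve_fit(lppl, t, y, p0=p0, bounds=bounds, maxfev=20000)
print("estimated critical time tc ≈", round(popt[3], 1))
```

On real price data, the fitted $t_c$ is at best a rough estimate of when the bubble regime ends, and robust pipelines fit many windows and report the distribution of $t_c$ rather than a single value.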

B. Regime Shifts (Hidden Markov Models - HMM)

  • Meaning: Markets often operate in distinct "regimes" (e.g., Low Volatility/Bull, High Volatility/Bear, Sideways). HMMs are used to model the probability of the market moving between these unobserved states.

  • Computation: Train an HMM (typically with 2 or 3 states) on market returns and volatility.

  • Danger: The probability of the High-Volatility regime spikes or remains elevated, indicating a structural shift to riskier market dynamics.
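The usual tool here is hmmlearn's `GaussianHMM`. As a dependency-lighter proxy that ignores transition dynamics, the sketch below clusters rolling volatility into two states with scikit-learn's `GaussianMixture` on synthetic returns — a simplification, not a full HMM:

```python
import numpy as np
import pandas as pd
from sklearn.mixture import GaussianMixture

# Synthetic returns: a calm regime followed by a turbulent one
rng = np.random.default_rng(2)
rets = np.concatenate([rng.normal(0.0005, 0.005, 300),
                       rng.normal(-0.001, 0.02, 100)])

# Feature: 21-day rolling realized volatility
vol = pd.Series(rets).rolling(21).std().dropna()
X = vol.to_numpy().reshape(-1, 1)

# Two-state mixture as a stand-in for a 2-state HMM (no transition matrix)
gm = GaussianMixture(n_components=2, random_state=0).fit(X)
states = gm.predict(X)
high_vol_state = int(np.argmax(gm.means_.ravel()))

# Danger reading: share of recent days spent in the high-volatility state
recent_share = (states[-60:] == high_vol_state).mean()
print("recent high-vol share:", round(recent_share, 2))
```

A genuine HMM adds the transition matrix, which lets you ask how sticky the high-volatility regime is, not just whether you are in it.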

C. Change-Point Detection (Statistical Jump)

  • Meaning: Identifies statistically significant, sudden changes in the market's underlying statistical properties (e.g., mean return, volatility).

  • Computation: Algorithms like CUSUM (Cumulative Sum) or Bayesian Online Change-Point Detection (BOCPD) are applied to returns or volatility.

  • Danger: A detected jump in the mean or variance of the market returns, signaling an abnormal and sudden break from previous behavior.
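A minimal one-sided CUSUM sketch on synthetic volatility data — the baseline mean is estimated from the early sample, and `threshold`/`drift` are illustrative tuning values, not standard settings:

```python
import numpy as np

def cusum_changepoint(x, threshold, drift=0.0):
    """One-sided CUSUM: accumulate positive deviations from a baseline mean
    (estimated on the early sample) minus an allowance `drift`; return the
    first index where the excursion exceeds `threshold`, else -1."""
    mu0 = x[: len(x) // 4].mean()
    s = 0.0
    for i, xi in enumerate(x):
        s = max(0.0, s + (xi - mu0) - drift)
        if s > threshold:
            return i
    return -1

# Synthetic daily volatility with an upward jump at index 150
rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(0.01, 0.002, 150), rng.normal(0.02, 0.002, 100)])

cp = cusum_changepoint(x, threshold=0.05, drift=0.002)
print("change detected at index", cp)
```

The drift term suppresses false alarms from ordinary noise; the detection lag after the true jump is the price paid for that robustness.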

D. Volatility/Volume Anomaly Clusters

  • Meaning: Identifies periods where the market's behavior deviates significantly from its historical norm in terms of volatility and volume interaction.

  • Computation: Unsupervised machine learning methods (like $k$-means or Isolation Forest) are used on multi-dimensional features (e.g., [Volume, Volatility, Spread]).

  • Danger: A new cluster forming away from the normal high-density states, or an increase in points flagged by an Isolation Forest (anomalies), suggests structural instability and unprecedented market conditions.
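A minimal Isolation Forest sketch on synthetic [volatility, volume] features with a small stressed cluster injected; the `contamination` setting is an illustrative assumption about how rare anomalies are:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic [volatility, volume] features: normal days plus a stressed cluster
rng = np.random.default_rng(4)
normal = np.column_stack([rng.normal(0.01, 0.002, 300),
                          rng.normal(1e6, 1e5, 300)])
stress = np.column_stack([rng.normal(0.04, 0.005, 15),
                          rng.normal(4e5, 5e4, 15)])   # high vol, thin volume
X = np.vstack([normal, stress])

iso = IsolationForest(contamination=0.05, random_state=0).fit(X)
labels = iso.predict(X)                  # -1 = anomaly, 1 = normal

stress_flagged = (labels[-15:] == -1).mean()
print("share of stressed days flagged:", round(stress_flagged, 2))
```

In a live diagnostic you would fit on a trailing history and score each new day, watching for a rising run of -1 labels rather than isolated one-off flags.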


Conclusion

You now have a total of eleven distinct market fragility and collapse indicators covering almost every facet of financial analysis:

Category                  | Indicators Included
Internal Health           | A/D Line, % Above 200MA, RSI Divergence
Trend & Sentiment         | Moving Averages (Death Cross), VIX Spikes
Macro & Fundamental       | Yield Curve Inversion, CAPE Ratio, Buffett Indicator
Credit & Leverage         | Credit Spreads Widening, Margin Debt
Structural & Quantitative | Liquidity, LPPL, HMM Regimes, Change-Point Detection, Anomaly Clustering

The final step in making this notebook actionable is to synthesize these signals into a single metric. The section below, the "Collapse Risk Scoreboard," outlines the methodology to aggregate the eleven indicators and assign a daily or weekly risk score.


🛑 14. Integrated Collapse Risk Scoreboard

The goal of the Scoreboard is to transform the complex network of divergence, valuation, and structural fragility signals into a simple, quantifiable Market Collapse Risk Score.

Methodology: Aggregation and Weighting

We will create a daily or weekly function that checks the status of each of your eleven indicators and assigns a point when a danger signal is active.

1. Define Danger Conditions

For each indicator, establish a clear, quantifiable rule that flags a "Danger State" (1 point).

Indicator         | Danger Condition (1 Point)                                                        | Category
A/D Line          | A/D Line falling over the last month while the Index is rising.                   | Breadth
% Above 200MA     | Percentage < 50% while the Index is within 5% of its 52-week high.                | Breadth
VIX Spikes        | VIX > 20 and the VIX 5-day SMA rising while the Index 5-day SMA is rising.        | Sentiment
Yield Curve       | 10Y − 2Y spread < 0 (inverted).                                                   | Macro
Credit Spreads    | 1-month change in the credit spread in the top 10% of historical 1-month changes. | Credit
50/200 MA         | 50-day MA < 200-day MA (Death Cross active).                                      | Trend
RSI Divergence    | Index price higher than 30 days ago, but RSI lower than 30 days ago.              | Momentum
Margin Debt       | Margin debt falling from a 12-month peak.                                         | Leverage
Liquidity         | Liquidity Stress Proxy above its 90th percentile.                                 | Structural
CAPE Ratio        | CAPE Ratio > 25.                                                                  | Valuation
Buffett Indicator | Buffett Indicator > 150%.                                                         | Valuation

2. Risk Scoring

The Market Collapse Risk Score is simply the sum of the active danger points:

$$\text{Risk Score} = \sum_{i=1}^{11} \mathbb{1}_{\text{Danger}_i}$$

Where $\mathbb{1}_{\text{Danger}_i}$ is an indicator function that equals 1 if the $i$-th condition is met, and 0 otherwise.

  • Risk Score Range: $0$ (Minimal Risk) to $11$ (Maximum Risk).

Python Implementation Structure

This function ties together the outputs from all previous sections to generate a daily score.

Python
def generate_collapse_risk_score(data_frame, vix_data, margin_data, yield_data, spread_proxy, cape_proxy, buffett_proxy):
    """
    Calculates the daily Market Collapse Risk Score based on active danger signals.
    
    Args:
        data_frame (pd.DataFrame): Combined DF with SPY, MAs, RSI.
        # ... other necessary computed Series/DataFrames ...
        
    Returns:
        pd.Series: Daily Risk Score (0 to 11).
    """
    
    # Initialize the Score Series
    risk_score = pd.Series(0, index=data_frame.index)

    # --- 1. Breadth: A/D Line & % Above 200MA (Placeholder Logic) ---
    # The A/D Line and % Above 200MA data needs to be added to the main DF in a real setup.
    # We use conceptual flags here:
    risk_score += (data_frame['Percent_Above_200MA'] < 50).astype(int).fillna(0) # Proxy for % above 200MA
    risk_score += (data_frame['AD_Line'].diff(30) < 0).astype(int).fillna(0) # Proxy for falling A/D

    # --- 2. Sentiment: VIX Spikes ---
    vix_rising_while_index_rising = (
        (data_frame['VIX_Close'] > 20) & 
        (data_frame['Close'].diff(5) > 0) & 
        (data_frame['VIX_Close'].diff(5) > 0)
    ).astype(int).fillna(0)
    risk_score += vix_rising_while_index_rising

    # --- 3. Macro: Yield Curve Inversion ---
    risk_score += (yield_data['Spread'] < 0).astype(int).fillna(0).reindex(risk_score.index, fill_value=0)

    # --- 4. Credit: Credit Spreads Widening (Conceptual: look for sharp increase) ---
    spread_danger = (spread_proxy.diff(10) > spread_proxy.diff(10).quantile(0.90)).astype(int).fillna(0)
    risk_score += spread_danger.reindex(risk_score.index, fill_value=0)

    # --- 5. Trend: 50/200 MA Crossover (Death Cross) ---
    death_cross_active = (data_frame['SMA_50'] < data_frame['SMA_200']).astype(int).fillna(0)
    risk_score += death_cross_active

    # --- 6. Momentum: RSI Divergence (Simple 30-day divergence) ---
    rsi_divergence = (
        (data_frame['Close'] > data_frame['Close'].shift(30)) & 
        (data_frame['RSI'] < data_frame['RSI'].shift(30))
    ).astype(int).fillna(0)
    risk_score += rsi_divergence

    # --- 7. Leverage & Valuation (Requires re-indexing since they are often monthly) ---
    risk_score += (margin_data.diff() < 0).astype(int).fillna(0).reindex(risk_score.index, fill_value=0) # Margin debt declining (simplified; the 12-month-peak test is omitted)
    risk_score += (cape_proxy > 25).astype(int).fillna(0).reindex(risk_score.index, fill_value=0)
    risk_score += (buffett_proxy > 150).astype(int).fillna(0).reindex(risk_score.index, fill_value=0)

    # --- 8. Liquidity & Quant (Conceptual/Placeholder) ---
    risk_score += (spread_proxy > spread_proxy.mean()).astype(int).fillna(0).reindex(risk_score.index, fill_value=0) # Liquidity stress
    risk_score += np.random.choice([0, 1], size=len(risk_score), p=[0.9, 0.1]) # Placeholder for HMM/LPPL/Change-Point

    return risk_score.clip(upper=11) # Cap the score at 11

# --- VISUALIZATION ---
# (Requires running all previous computation blocks to create the necessary variables)
# risk_scores = generate_collapse_risk_score(...) 

# fig, ax1 = plt.subplots(figsize=(12, 6))
# ax1.plot(risk_scores, color='red', linewidth=2)
# ax1.set_title('Market Collapse Risk Score (0-11)')
# ax1.set_ylabel('Risk Score')
# ax1.set_xlabel('Date')
# ax1.axhline(6, color='orange', linestyle='--', label='High Risk Threshold (>5)')
# plt.show()

Final Step: Smoothing for Clarity

A score aggregated from eleven different indicators is inevitably noisy. Applying a smoothing technique to the final Risk Score makes it much easier to spot creeping danger; the section below uses cubic spline smoothing to produce a continuous representation of the underlying risk trend.


📈 15. Risk Score Visualization with Cubic Smoothing

We will use the SciPy library to fit a cubic spline to the final risk_scores series. A natural cubic spline passes through its input points while minimizing overall curvature, which yields a continuous, trend-revealing curve. One caveat: a spline fit through every daily point interpolates the noise rather than smoothing it, so in practice you would fit it to downsampled (e.g., weekly) scores or use a smoothing spline that is allowed to deviate from the points.

Python Implementation for Smoothing

Python
from scipy.interpolate import CubicSpline
import matplotlib.pyplot as plt
import numpy as np

# --- Assumed Pre-requisites (Risk Score Series) ---
# NOTE: In a live notebook, you MUST execute the 'generate_collapse_risk_score' 
# function from Section 14 first to create the 'risk_scores' Series.
# For demonstration, we'll create a synthetic risk_scores series:
risk_scores_synthetic = pd.Series(
    np.random.randint(low=1, high=6, size=len(index_data)), 
    index=index_data.index
).rolling(window=10).sum().bfill() # Create a noisy, trending series
risk_scores_synthetic /= risk_scores_synthetic.max() / 11 # Scale 0-11
risk_scores_synthetic.iloc[-50:] = risk_scores_synthetic.iloc[-50:] + np.linspace(0, 3, 50) # Force a rising danger

risk_scores = risk_scores_synthetic.reindex(index_data.index).bfill()
risk_scores = risk_scores.clip(upper=11)


# --- CUBIC SPLINE INTERPOLATION ---

# 1. Convert the time index (dates) to numerical values
x_data = np.arange(len(risk_scores))
y_data = risk_scores.values

# 2. Define the smooth interpolation function
cs = CubicSpline(x_data, y_data)

# 3. Define a denser set of x-points for the smooth curve
x_smooth = np.linspace(x_data.min(), x_data.max(), 500)
y_smooth = cs(x_smooth)

# --- VISUALIZATION ---
fig, ax1 = plt.subplots(figsize=(12, 6))

# Plot the Raw Daily Score (Faintly)
ax1.plot(risk_scores.index, risk_scores.values, color='gray', linewidth=1, alpha=0.4, label='Raw Daily Risk Score')

# Plot the Smoothed Cubic Spline Score (The trend signal)
# We map the numerical x_smooth back to the original dates for the plot
ax1.plot(risk_scores.index[x_smooth.astype(int)], y_smooth, color='red', linewidth=3, label='Cubic Spline Smoothed Risk')

ax1.set_title('Market Collapse Risk Scoreboard: Smoothed Trend')
ax1.set_ylabel('Risk Score (0 - 11)')
ax1.set_xlabel('Date')
ax1.set_ylim(0, 12) # Set consistent y-axis for scoring

# Danger Thresholds
ax1.axhline(6, color='orange', linestyle='--', label='High Risk Threshold (>5)')
ax1.axhline(9, color='darkred', linestyle='-', label='Extreme Risk (>8)')

# Contextual plot of the Index Price (Secondary Y-axis)
ax2 = ax1.twinx()
ax2.plot(index_data['Close'], color='tab:green', linestyle=':', alpha=0.5, label='SPY Price')
ax2.set_ylabel('Index Price (SPY)', color='tab:green')
ax2.tick_params(axis='y', labelcolor='tab:green')


# Final Legend
lines, labels = ax1.get_legend_handles_labels()
lines2, labels2 = ax2.get_legend_handles_labels()
ax1.legend(lines + lines2, labels + labels2, loc='upper left')

plt.grid(True, linestyle='--', alpha=0.6)
fig.tight_layout()
plt.show()
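An exact cubic spline reproduces every raw point, so for genuine smoothing a spline that is allowed to deviate from the data works better. A sketch using SciPy's `UnivariateSpline` with a positive smoothing factor `s`, on a synthetic stand-in for the risk score:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Synthetic noisy risk score: a slow trend buried in day-to-day noise
rng = np.random.default_rng(5)
x = np.arange(250, dtype=float)
y = np.clip(4 + 2 * np.sin(x / 40) + rng.normal(0, 1.0, x.size), 0, 11)

# s > 0 allows the spline to deviate from the points; s ~ n * noise_variance
# is a common starting choice, and larger s gives a smoother curve
spl = UnivariateSpline(x, y, k=3, s=float(len(x)))
y_smooth = spl(x)

print("raw std of daily changes:     ", round(float(np.diff(y).std()), 2))
print("smoothed std of daily changes:", round(float(np.diff(y_smooth).std()), 3))
```

Unlike the interpolating `CubicSpline` above, this curve no longer passes through every point, which is exactly what makes the trend visible.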

Conclusion of the Market Collapse Diagnostic

Your notebook is now complete! It integrates six layers of traditional and advanced financial analysis, borrows cubic-spline smoothing (a staple of yield-curve construction) for clarity, and culminates in a powerful, smoothed Risk Score:

  1. Breadth (A/D Line, % Above 200MA)

  2. Trend & Momentum (MAs, RSI Divergence)

  3. Sentiment & Volatility (VIX Spikes)

  4. Macro & Credit Risk (Yield Curve, Credit Spreads)

  5. Valuation & Leverage (CAPE, Buffett, Margin Debt)

  6. Structural Fragility (Liquidity, HMM, LPPL, etc.)

This setup provides a comprehensive, multi-signal diagnostic tool for identifying market fragility.

