# Portfolio Optimization

Modern portfolio theory implementations including mean-variance optimization, risk parity approaches, and weight constraint utilities. Provides sophisticated portfolio construction algorithms for quantitative asset allocation.

## Capabilities
### Mean-Variance Optimization

Classic Markowitz mean-variance optimization for finding optimal portfolio weights.

```python { .api }
def calc_mean_var_weights(returns, weight_bounds=(0.0, 1.0), rf=0.0, covar_method="ledoit-wolf", options=None):
    """
    Calculate mean-variance optimization weights (maximum Sharpe ratio portfolio).

    Parameters:
    - returns (pd.DataFrame): Return series for assets
    - weight_bounds (tuple): Min and max weight bounds (default: (0.0, 1.0))
    - rf (float): Risk-free rate (default: 0.0)
    - covar_method (str): Covariance estimation method ('ledoit-wolf', 'empirical', 'oas')
    - options (dict): Additional optimization options

    Returns:
    pd.Series: Optimal portfolio weights indexed by asset names
    """
```
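The `weight_bounds` argument enforces per-asset limits inside the optimizer itself, rather than adjusting weights after the fact. A minimal sketch, using synthetic returns so it runs without market data (the asset names are purely illustrative):

```python
import numpy as np
import pandas as pd
import ffn

# Synthetic daily returns for three hypothetical assets
rng = np.random.default_rng(42)
returns = pd.DataFrame(
    rng.normal(0.0005, 0.01, size=(500, 3)),
    columns=["ASSET_A", "ASSET_B", "ASSET_C"],
)

# Long-only weights with no single asset above 60%, enforced during optimization
weights = ffn.calc_mean_var_weights(returns, weight_bounds=(0.0, 0.6))
print(weights.round(4))
```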
### Risk Parity Optimization

Equal risk contribution portfolio construction methods.

```python { .api }
def calc_erc_weights(returns, initial_weights=None, risk_weights=None, covar_method="ledoit-wolf", risk_parity_method="ccd", maximum_iterations=100, tolerance=1e-8):
    """
    Calculate equal risk contribution (ERC) weights where each asset contributes equally to portfolio risk.

    Parameters:
    - returns (pd.DataFrame): Return series for assets
    - initial_weights (pd.Series): Starting weights for optimization (default: equal weights)
    - risk_weights (pd.Series): Target risk contribution weights (default: equal risk)
    - covar_method (str): Covariance estimation method ('ledoit-wolf', 'empirical', 'oas')
    - risk_parity_method (str): Risk parity algorithm ('ccd' - cyclical coordinate descent)
    - maximum_iterations (int): Maximum optimization iterations (default: 100)
    - tolerance (float): Convergence tolerance (default: 1e-8)

    Returns:
    pd.Series: ERC portfolio weights indexed by asset names
    """

def calc_inv_vol_weights(returns):
    """
    Calculate inverse volatility weights (simple risk parity approach).

    Parameters:
    - returns (pd.DataFrame): Return series for assets

    Returns:
    pd.Series: Inverse volatility weights indexed by asset names
    """
```
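The defining property of an ERC portfolio is that each asset's share of total portfolio variance is equal. A quick, hedged check of that property on synthetic data is sketched below; it recomputes contributions from the sample covariance while the optimizer defaults to a Ledoit-Wolf estimate, so the contributions will only be approximately equal:

```python
import numpy as np
import pandas as pd
import ffn

# Synthetic daily returns for four hypothetical assets with different volatilities
rng = np.random.default_rng(1)
vols = np.array([0.020, 0.010, 0.005, 0.015])
returns = pd.DataFrame(rng.normal(0.0003, vols, size=(750, 4)), columns=["A", "B", "C", "D"])

erc = ffn.calc_erc_weights(returns)

# Fractional risk contributions: w_i * (Sigma w)_i / (w' Sigma w)
cov = returns.cov().values
w = erc.values
rc = w * (cov @ w) / (w @ cov @ w)
print(pd.Series(rc, index=returns.columns).round(4))  # roughly 0.25 per asset
```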
### Weight Utilities

Utility functions for weight manipulation and constraint handling.

```python { .api }
def limit_weights(weights, limit=0.1):
    """
    Apply maximum weight limits and redistribute excess proportionally.

    Parameters:
    - weights (pd.Series): Portfolio weights
    - limit (float): Maximum weight per asset (default: 0.1 for 10%)

    Returns:
    pd.Series: Adjusted weights respecting limits
    """

def random_weights(n, bounds=(0.0, 1.0), total=1.0):
    """
    Generate random portfolio weights for testing or Monte Carlo simulation.

    Parameters:
    - n (int): Number of assets
    - bounds (tuple): Weight bounds for each asset (default: (0.0, 1.0))
    - total (float): Total weight sum (default: 1.0)

    Returns:
    np.array: Random weights summing to total
    """
```
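The two utilities combine naturally in a small Monte Carlo search: draw random long-only weight vectors, cap each asset, and keep the best in-sample result. A hedged sketch on synthetic returns (the asset names and the 25% cap are illustrative):

```python
import numpy as np
import pandas as pd
import ffn

# Synthetic daily returns for six hypothetical assets
rng = np.random.default_rng(7)
returns = pd.DataFrame(
    rng.normal(0.0004, 0.01, size=(500, 6)),
    columns=[f"ASSET_{i}" for i in range(6)],
)

best_sharpe, best_weights = -np.inf, None
for _ in range(1000):
    w = pd.Series(ffn.random_weights(len(returns.columns)), index=returns.columns)
    w = ffn.limit_weights(w, limit=0.25)  # cap any single asset at 25%
    port = (returns * w).sum(axis=1)
    sharpe = port.mean() / port.std() * np.sqrt(252)
    if sharpe > best_sharpe:
        best_sharpe, best_weights = sharpe, w

print(best_weights.round(4))
print(f"Best in-sample Sharpe: {best_sharpe:.2f}")
```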
## Usage Examples

### Mean-Variance Optimization

```python
import ffn
import pandas as pd

# Download asset data
tickers = ['AAPL', 'MSFT', 'GOOGL', 'AMZN', 'TSLA']
prices = ffn.get(tickers, start='2020-01-01')
returns = ffn.to_returns(prices).dropna()

# Calculate mean-variance optimal weights
mv_weights = ffn.calc_mean_var_weights(returns, rf=0.02)
print("Mean-Variance Optimal Weights:")
print(mv_weights.round(4))

# Apply weight limits
limited_weights = ffn.limit_weights(mv_weights, limit=0.3)
print("\nWith 30% Weight Limit:")
print(limited_weights.round(4))

# Portfolio performance (nperiods=252 de-annualizes the risk-free rate for daily returns)
portfolio_returns = (returns * mv_weights).sum(axis=1)
portfolio_sharpe = ffn.calc_sharpe(portfolio_returns, rf=0.02, nperiods=252)
print(f"\nPortfolio Sharpe Ratio: {portfolio_sharpe:.3f}")
```
### Risk Parity Optimization

```python
import ffn

# Download diversified asset data
assets = ['VTI', 'VEA', 'VWO', 'BND', 'VNQ']  # Stocks, Bonds, REITs
prices = ffn.get(assets, start='2015-01-01')
returns = ffn.to_returns(prices).dropna()

# Equal Risk Contribution weights
erc_weights = ffn.calc_erc_weights(returns)
print("Equal Risk Contribution Weights:")
print(erc_weights.round(4))

# Inverse volatility weights (simpler approach)
inv_vol_weights = ffn.calc_inv_vol_weights(returns)
print("\nInverse Volatility Weights:")
print(inv_vol_weights.round(4))

# Compare portfolio risk
erc_returns = (returns * erc_weights).sum(axis=1)
inv_vol_returns = (returns * inv_vol_weights).sum(axis=1)

print(f"\nERC Portfolio Volatility: {erc_returns.std() * (252**0.5):.3f}")
print(f"Inv Vol Portfolio Volatility: {inv_vol_returns.std() * (252**0.5):.3f}")
```
### Custom Risk Targets

```python
import ffn
import pandas as pd

# Multi-asset portfolio
assets = ['SPY', 'QQQ', 'IWM', 'TLT', 'GLD']
prices = ffn.get(assets, start='2018-01-01')
returns = ffn.to_returns(prices).dropna()

# Custom risk allocation: 60% equity risk, 40% alternative risk
equity_assets = ['SPY', 'QQQ', 'IWM']
alt_assets = ['TLT', 'GLD']

# Target risk weights
risk_weights = pd.Series(0.0, index=returns.columns)
risk_weights[equity_assets] = 0.60 / len(equity_assets)  # Equal risk within equity
risk_weights[alt_assets] = 0.40 / len(alt_assets)  # Equal risk within alternatives

# Calculate ERC weights with custom risk targets
custom_erc_weights = ffn.calc_erc_weights(returns, risk_weights=risk_weights)
print("Custom Risk Target Weights:")
print(custom_erc_weights.round(4))

# Portfolio performance
portfolio_returns = (returns * custom_erc_weights).sum(axis=1)
print(f"\nPortfolio Sharpe: {ffn.calc_sharpe(portfolio_returns, rf=0.02, nperiods=252):.3f}")
```
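To verify that the realized risk contributions land near the targets in `risk_weights`, a short continuation of the snippet above can recompute them from the sample covariance (the optimizer defaults to a Ledoit-Wolf estimate, so expect approximate rather than exact agreement):

```python
# Realized fractional risk contributions: w_i * (Sigma w)_i / (w' Sigma w)
cov = returns.cov().values
w = custom_erc_weights.reindex(returns.columns).values
realized = pd.Series(w * (cov @ w) / (w @ cov @ w), index=returns.columns)

print(pd.DataFrame({'target': risk_weights, 'realized': realized}).round(3))
```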
### Weight Constraint Management

```python
import ffn
import pandas as pd

# Generate optimization scenario
assets = ['AAPL', 'MSFT', 'GOOGL', 'AMZN', 'NFLX', 'NVDA', 'META', 'TSLA']
prices = ffn.get(assets, start='2020-01-01')
returns = ffn.to_returns(prices).dropna()

# Unconstrained mean-variance optimization
unconstrained_weights = ffn.calc_mean_var_weights(returns, rf=0.02)
print("Unconstrained Weights:")
print(unconstrained_weights.round(4))
print(f"Max weight: {unconstrained_weights.max():.3f}")

# Apply progressive weight limits
for limit in [0.4, 0.3, 0.2, 0.15]:
    limited = ffn.limit_weights(unconstrained_weights, limit=limit)
    print(f"\nMax {limit*100:.0f}% Weight Limit:")
    print(f"Largest weight: {limited.max():.3f}")
    print(f"Number of assets > 5%: {(limited > 0.05).sum()}")

# Compare portfolio performance across constraints
results = {}
for limit in [None, 0.4, 0.3, 0.2, 0.15]:
    if limit is None:
        weights = unconstrained_weights
        label = 'Unconstrained'
    else:
        weights = ffn.limit_weights(unconstrained_weights, limit=limit)
        label = f'{limit*100:.0f}% Limit'

    port_returns = (returns * weights).sum(axis=1)
    results[label] = {
        'Sharpe': ffn.calc_sharpe(port_returns, rf=0.02, nperiods=252),
        'Volatility': port_returns.std() * (252**0.5),
        'Max Weight': weights.max()
    }

constraint_df = pd.DataFrame(results).T
print("\nConstraint Impact Analysis:")
print(constraint_df.round(3))
```
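A related design choice worth noting: `limit_weights` clips an existing solution and redistributes the excess, whereas passing `weight_bounds` to `calc_mean_var_weights` enforces the cap inside the optimizer and can produce a different portfolio. A short sketch continuing from the example above (the 20% cap is arbitrary):

```python
# Cap applied after optimization vs. enforced during optimization
clipped = ffn.limit_weights(unconstrained_weights, limit=0.2)
bounded = ffn.calc_mean_var_weights(returns, weight_bounds=(0.0, 0.2), rf=0.02)

print(pd.DataFrame({'clipped': clipped, 'bounded': bounded}).round(4))
```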
### Advanced Optimization

```python
import ffn
import pandas as pd

# Multi-period optimization example
prices = ffn.get('SPY,QQQ,IWM,EFA,EEM,TLT,GLD', start='2010-01-01')
returns = ffn.to_returns(prices).dropna()

# Rolling optimization (rebalancing quarterly)
lookback_days = 252 * 2  # 2 years of data
rebalance_freq = 63  # Quarterly (approx. 63 trading days)

portfolio_performance = []
rebalance_dates = returns.index[lookback_days::rebalance_freq]

for rebal_date in rebalance_dates[:8]:  # Limited example
    # Get historical data for optimization
    end_idx = returns.index.get_loc(rebal_date)
    start_idx = end_idx - lookback_days
    hist_returns = returns.iloc[start_idx:end_idx]

    # Calculate optimal weights
    weights = ffn.calc_mean_var_weights(hist_returns, rf=0.02)

    # Apply to next period
    next_period_start = end_idx
    next_period_end = min(end_idx + rebalance_freq, len(returns))
    next_returns = returns.iloc[next_period_start:next_period_end]

    # Portfolio returns for this period
    port_returns = (next_returns * weights).sum(axis=1)

    portfolio_performance.extend(port_returns.tolist())

    print(f"Rebalance {rebal_date.strftime('%Y-%m-%d')}: Sharpe = {ffn.calc_sharpe(port_returns, rf=0.02, nperiods=252):.3f}")

# Convert to series and analyze
portfolio_series = pd.Series(
    portfolio_performance,
    index=returns.index[lookback_days:lookback_days + len(portfolio_performance)]
)

print(f"\nOverall Rolling Portfolio Sharpe: {ffn.calc_sharpe(portfolio_series, rf=0.02, nperiods=252):.3f}")
```
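To look at the stitched track record beyond its Sharpe ratio, the rolling returns can be re-based into a price index and summarized. A sketch continuing from the example above; it assumes ffn's `to_price_index` and `calc_max_drawdown` helpers are available (they are not documented in this section):

```python
# Re-base the stitched daily returns to an equity curve starting at 100
equity_curve = ffn.to_price_index(portfolio_series, start=100)

print(f"Ending value: {equity_curve.iloc[-1]:.1f}")
print(f"Max drawdown: {ffn.calc_max_drawdown(equity_curve):.2%}")
print(f"Annualized volatility: {portfolio_series.std() * (252**0.5):.2%}")
```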