Exactly, this is included in my reasoning. There are two competing elements: the decrease in the arithmetic average of the returns on one side, and the decrease in the volatility on the other. If you decrease the average of the returns too much, then of course reducing the volatility won't help you. So my educated guess was: if what is happening is that you're decreasing the volatility more strongly than you're decreasing the average of the returns (I don't know this for sure, but at least theoretically it's entirely possible), then there will be a certain optimal allocation at which what you gain by decreasing the volatility outweighs what you lose by decreasing the average of the returns.

Marseille07 wrote: ↑Mon May 10, 2021 2:39 pm
Except that this only applies when you're comparing apples to apples. When we're comparing 100/0 vs 90/10, say 1M each, we're comparing 1M of equities vs 900K of equities plus 100K in bonds.
Tell me again how 900K's return is greater than 1M's, at the same volatility on the equities portion?
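For what it's worth, the trade-off being debated can be sketched numerically. The usual approximation is that the geometric (compound) return is roughly the arithmetic mean minus half the variance, so a bond allocation helps only if it cuts volatility faster than it cuts the mean. The sketch below is purely illustrative; the return, volatility, and correlation figures are assumptions I made up, not market estimates:

```python
# Hedged sketch of the volatility-drag argument.
# Approximation used: geometric return ≈ arithmetic mean − σ²/2.
# All numbers below are illustrative assumptions, not market data.

def geometric_return(mean, vol):
    """Approximate compound growth rate from arithmetic mean and volatility."""
    return mean - vol ** 2 / 2

# Assumed annual figures (hypothetical):
STOCK_MEAN, STOCK_VOL = 0.07, 0.17
BOND_MEAN, BOND_VOL = 0.02, 0.05
CORRELATION = 0.0  # assume stocks and bonds are uncorrelated, for simplicity

def portfolio_geo(stock_weight):
    """Approximate geometric return of a stock/bond mix."""
    w, b = stock_weight, 1 - stock_weight
    mean = w * STOCK_MEAN + b * BOND_MEAN
    var = (w * STOCK_VOL) ** 2 + (b * BOND_VOL) ** 2 \
          + 2 * w * b * CORRELATION * STOCK_VOL * BOND_VOL
    return geometric_return(mean, var ** 0.5)

for w in (1.0, 0.9):
    print(f"{int(w * 100)}/{int((1 - w) * 100)}: geometric ≈ {portfolio_geo(w):.4%}")
```

With these assumed numbers, the equity-return gap dominates and 100/0 still comes out ahead of 90/10, which is consistent with the objection that the expected-return difference between stocks and bonds is too large for the effect to kick in; shrink that gap (or raise equity volatility) in the inputs and the ranking can flip.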
Vineviz told me that unfortunately the difference in the expected returns between bonds and stocks is too large for this effect to take place, and all I said was "show me".
Alright, fair enough. Thank you.

willthrill81 wrote: ↑Mon May 10, 2021 2:47 pm
I used the Simba backtesting spreadsheet to create the chart below, which shows the internal rate of return (IRR) of 100/0 vs. 90/10 allocations going back to 1871. From 1904-1910, the 30-year IRR was better for 90/10 than for 100/0, but 100/0 had a higher IRR in nearly every other 30-year period.