I will divide the answer into three sections:
A) the simplest case: independent stationary variables
B) a few words on time-series models
C) stationary correlated variables
A) We will employ the simplest statistical model:
the value describing each component (a fund) is a
random variable with a normal distribution (described by its mean and sigma).
Then, the theorem which provides the answer is this:
if X and Y are independent random variables that are normally
distributed, then X + Y is also normally distributed.
You can skip the proof (which is here)
and the general background on the normal (= Gaussian, or 'bell curve')
distribution (which is here)
and just use the result: means are additive, and variances (not
sigmas) are additive.
Meaning: for N mutual funds (components) (let's say N=2) we have
fund 1 has mean.1 and sigma.1 (average and standard deviation)
fund 2 has mean.2 and sigma.2 (average and standard deviation)
for a particular property (such as price, dividend, current market
value, or ROI ...).
Let's pick an additive property (such as current market value = V).
Then, when we compose a portfolio, the property of the total is
V.t = w.1 * V.1 + w.2 * V.2
(the market value of the portfolio IS the sum of the values of the components;
that is not true for all properties, e.g. not for ROI).
-------------- meaning of the weights w
Here V.1 can be 100 shares of GOOG, V.2 can be 1000 shares of GERN ...
If weight w.1 = 2 and w.2 = 5, then the composite (portfolio) would have
200 shares of GOOG and 5000 shares of GERN, OK?
So, if V.1 and V.2 are Gaussian with the parameters shown above,
then V.t will be Gaussian with parameters
mean.t = w.1 * mean.1 + w.2 * mean.2
and standard deviation sigma.t defined by
sigma.t ^2 = (w.1 * sigma.1) ^2 + (w.2 * sigma.2) ^2
where ^ means power: 3^2 = 9, 2^3 = 8, ...
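A minimal sketch of the two formulas above, using made-up fund parameters (the means, sigmas and weights below are illustrative numbers, not real fund data):

```python
import math

# Hypothetical example funds: (mean, sigma) of current market value.
funds = [(100.0, 15.0),   # fund 1: mean.1, sigma.1
         (50.0, 8.0)]     # fund 2: mean.2, sigma.2
weights = [2.0, 5.0]      # w.1, w.2

# Means are additive (with weights): mean.t = w.1*mean.1 + w.2*mean.2
mean_t = sum(w * m for w, (m, s) in zip(weights, funds))

# Variances (not sigmas) are additive for independent funds:
# sigma.t^2 = (w.1*sigma.1)^2 + (w.2*sigma.2)^2
var_t = sum((w * s) ** 2 for w, (m, s) in zip(weights, funds))
sigma_t = math.sqrt(var_t)

print(mean_t, sigma_t)  # 450.0 50.0
```

Note that sigma.t (50) is less than the weighted sum of the sigmas (2*15 + 5*8 = 70): this is the diversification effect of adding variances instead of sigmas.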
B) When you say (in your clarification)
"standard deviation of returns is for the portfolio after N periods .."
you are using the symbol N in a second meaning.
Let's talk about N funds, which we will hold for M time-periods (e.g.
weeks, or months):
funds i = 1, 2, ... N and times l = 1, 2, .... M
The simplest assumption is that the values V.i.l (value of fund i at time l)
are independent of each other and of their own past.
In such a case, the values (V.1.1, V.1.2, V.1.3, ...) for fund i=1 are just
samples taken from the same
distribution N(mean.1, sigma.1), and so on for the other i's.
That assumes the fundamentals did not change over time and prices have no memory.
That is a gross oversimplification. The commonly accepted model is a
'random walk' (a Markov chain)
in which prices do not have long-term memory, but the next period's
price V(t+1) depends on the previous price V(t).
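To make the random-walk idea concrete, here is a minimal simulation sketch (the starting price, step size, and number of periods are made-up parameters; real models usually use multiplicative steps on returns, not additive steps on prices):

```python
import random

random.seed(42)  # for reproducibility

def random_walk(v0, periods, step_sigma):
    """Simulate a price path where V(t+1) = V(t) + a random normal step.

    Each new price depends only on the previous price, not on the
    whole history -- the Markov (memory-less) property.
    """
    prices = [v0]
    for _ in range(periods):
        prices.append(prices[-1] + random.gauss(0.0, step_sigma))
    return prices

path = random_walk(v0=100.0, periods=12, step_sigma=2.0)
```

Under this model the uncertainty grows with time: after M periods the variance of the price is roughly M times the one-period step variance.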
The 'Nobel prize- winning' application based on this model (applied to
options) is described here
Books are available with details and generalizations, but the formulas
are too complex for a spreadsheet.
Search term: Black-Scholes
A popular exposé of the model: (a classic by Burton Malkiel)
C) We will stay with the memory-less model and just remove the
independence assumption.
We have N time-series V.i (i = 1, 2, ... N), which we treat as M
samples from N distributions.
We construct the composite time series V.t as before.
We calculate the 'expectation value' of the square, M2 = <V.t * V.t>, and get
M2 = sum over i, sum over j of w.i * w.j * <V.i * V.j>
We need to understand the relation of M2, variance and sigma, defined here;
and the correlation coefficients r.i.j, defined here;
and in (too many) details, here;
and in a slightly more general way, as cross-correlation, here.
For our purposes:
M2 = second moment
variance = M2 - mean^2
sigma = variance^0.5 (square root)
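A quick numerical check of these three relations, on a small made-up sample:

```python
# Made-up sample of values (illustrative numbers only).
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(data)

mean = sum(data) / n                   # first moment <V>
m2 = sum(x * x for x in data) / n      # second moment M2 = <V*V>
variance = m2 - mean ** 2              # variance = M2 - mean^2
sigma = variance ** 0.5                # sigma = variance^0.5

print(mean, m2, variance, sigma)  # 5.0 29.0 4.0 2.0
```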
Inserting the cross-correlations r.i.j and the sigmas sigma.i into the above
expression, we obtain a formula for sigma.t in terms of the sigmas and the r's:
sigma.t ^2 = sum over i, sum over j of w.i * w.j * r.i.j * sigma.i * sigma.j
mean.t is the same as in case A).
If all cross-correlations r.i.j (for i != j) are zero, then this
formula reduces to the
previous (independent variables) case. (Here the symbol != means 'not equal'.)
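A sketch of the correlated-funds calculation, reusing the same made-up fund parameters as before and an assumed correlation of 0.3 between the two funds:

```python
import math

# Hypothetical fund parameters (illustrative numbers only).
means   = [100.0, 50.0]   # mean.1, mean.2
sigmas  = [15.0, 8.0]     # sigma.1, sigma.2
weights = [2.0, 5.0]      # w.1, w.2

# Correlation matrix: r[i][j] = r.i.j, with r.i.i = 1 on the diagonal.
r = [[1.0, 0.3],
     [0.3, 1.0]]

n = len(means)

# mean.t is the same as in the independent case A).
mean_t = sum(weights[i] * means[i] for i in range(n))

# sigma.t^2 = sum over i, j of w.i * w.j * r.i.j * sigma.i * sigma.j
var_t = sum(weights[i] * weights[j] * r[i][j] * sigmas[i] * sigmas[j]
            for i in range(n) for j in range(n))
sigma_t = math.sqrt(var_t)
```

With r.1.2 = 0 this reduces to the independent-variables result (variance 2500, sigma 50); the positive correlation of 0.3 raises the portfolio variance to 3220, since positively correlated funds diversify less.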
I did try to make it as simple as possible, but there is always
room for improvement:
please do ask for clarification (RFC) if anything I wrote is not clear.
After all is clear, a rating is appreciated.