Financial Markets Shannon Capacity

From Wikipedia, the free encyclopedia
In information theory, the Shannon–Hartley theorem is an application of the noisy channel coding theorem to the archetypal case of a continuous-time analog communications channel subject to Gaussian noise. The theorem establishes Shannon's channel capacity for such a communication link, a bound on the maximum amount of error-free digital data (that is, information) that can be transmitted with a specified bandwidth in the presence of the noise interference, under the assumption that the signal power is bounded and the Gaussian noise process is characterized by a known power or power spectral density. The law is named after Claude Shannon and Ralph Hartley.
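The bound described above is the familiar Shannon–Hartley formula C = B·log2(1 + S/N). As a minimal sketch (the 3 kHz bandwidth and 30 dB SNR figures below are illustrative, not from the text):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative example: a voice-grade channel with 3 kHz of bandwidth
# and a 30 dB signal-to-noise ratio.
snr = 10 ** (30 / 10)  # convert 30 dB to a linear power ratio (1000)
capacity = shannon_capacity(3000, snr)
print(f"{capacity:.0f} bit/s")  # roughly 30 kbit/s
```

Note that capacity grows linearly with bandwidth but only logarithmically with signal power, which is why widening the channel pays off faster than shouting louder.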
Claude Shannon's development of information theory during World War II provided the next big step in understanding how much information could be reliably communicated through noisy channels. Building on Hartley's foundation, Shannon's noisy channel coding theorem (1948) describes the maximum possible efficiency of error-correcting methods versus levels of noise interference and data corruption.
So how do bandwidth and noise affect the rate at which information can be transmitted over an analog channel?
Similarly, how do investors' capacity and noise affect the rate at which economic information can be shared among investors?
Hence, because of the intrinsic uncertainty in the market, investors should hold portfolios that are as diversified as possible, i.e. portfolios staying far from the boundaries of the unit simplex. In other words, portfolios should remain as close as possible to the equi-weighted portfolio 1/N, i.e. the portfolio that maximizes the Shannon entropy H(w1,...,wn).
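To see that the equi-weighted portfolio maximizes entropy, one can compare H(w) = -Σ wᵢ log wᵢ for a 1/N allocation against a concentrated one. A minimal sketch (the four-asset weights are illustrative assumptions, not data from the text):

```python
import math

def shannon_entropy(weights):
    """Shannon entropy H(w) = -sum(w_i * log(w_i)), with 0*log(0) taken as 0."""
    return -sum(w * math.log(w) for w in weights if w > 0)

n = 4
equal_weighted = [1 / n] * n               # the 1/N portfolio
concentrated = [0.85, 0.05, 0.05, 0.05]    # illustrative concentrated portfolio

print(shannon_entropy(equal_weighted))  # log(4), the maximum for n = 4 assets
print(shannon_entropy(concentrated))    # strictly lower entropy
```

The equi-weighted portfolio attains the maximum H = log N; any deviation toward a boundary of the simplex lowers the entropy.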
