Chaos, complexity and entropy [for non-physicists]
-
- Posts: 9242
- Joined: Sun Dec 26, 2010 11:47 am
Chaos, complexity and entropy [for non-physicists]
I have often referred to these concepts in this forum as a counter-argument to those who believe in the value of economic models based on limited past data regarding long-range future prediction of success.
I stumbled upon this excellent, simplified discussion of these phenomena entitled "Chaos, Complexity, and Entropy: A physics talk for non-physicists", by Michael Baranger of MIT: http://necsi.edu/projects/baranger/cce.pdf
I recommend it for those not very familiar with the subject...it is clear and concise, and may forever change the way you view prediction, investing, the economy, life, the universe and everything. It is probably of little practical value regarding mundane practical decision-making (e.g. asset allocation, whether to buy or rent, etc.). But grasping and internalizing it is something I find not only fascinating, but life-altering in the way we view our decisions, and in a strange way, very liberating. You can read Wade Pfau and still laugh a little.
Plus, for those who still remember some of the physics they studied in high school, the relationship between entropy, chaos, information and perception is beyond fascinating.
(Victoria will particularly appreciate the mention of Jane Austen).
Re: Chaos, complexity and entropy simplified
Thanks, I found this very interesting
- whaleknives
- Posts: 1238
- Joined: Sun Jun 24, 2012 7:19 pm
Re: Chaos, complexity and entropy simplified
I'm sorry, but entropy is a physical property, like temperature and pressure, useful for describing power cycles using steam and gas turbines. You might as well fantasize about heat of vaporization.
"I'm an indexer. I own the market. And I'm happy." (John Bogle, "BusinessWeek", 8/17/07) ☕ Maritime signal flag W - Whiskey: "I require medical assistance."
Re: Chaos, complexity and entropy simplified
whaleknives wrote: I'm sorry, but entropy is a physical property, like temperature and pressure, useful for describing power cycles using steam and gas turbines. You might as well fantasize about heat of vaporization.
The term is also used in Information Theory to represent the amount of information that is missing; that's how it's used in the paper.
See https://en.wikipedia.org/wiki/Entropy#I ... ion_theory for more
Re: Chaos, complexity and entropy simplified
Exactly, coyote. And, though I am no expert, from my readings the 21st-century trend is toward considering "information" more fundamental than concepts such as matter or energy. The article was written by a physicist, not by a computer scientist or a psychologist. To my knowledge there are not different types of entropy. Entropy has not changed...it is being seen by physicists (or at least those with anything approaching a deep understanding of this bizarre concept) in more profound ways than what we learned in high school. I doubt that a meaningful distinction exists between "informational entropy" and "physical entropy".
Anyway, the entropy part of the work is more general food for thought. From a practical standpoint regarding economic assumptions, forecasting, modeling etc, the part about chaos and complexity is highly relevant re: many threads in this forum.
Re: Chaos, complexity and entropy simplified
protagonist wrote: . . . the 21st century trend is towards considering "information" more fundamental than concepts such as matter or energy.
That might be the consensus of the basket weavers, but it won't keep your lights on.
Re: Chaos, complexity and entropy simplified
Coyote wrote:
whaleknives wrote: I'm sorry, but entropy is a physical property, like temperature and pressure, useful for describing power cycles using steam and gas turbines. You might as well fantasize about heat of vaporization.
The term is also used in Information Theory to represent the amount of information that is missing, that's how it's used in the paper.
You can't make this up. The phrase "physics envy" comes to mind.
See https://en.wikipedia.org/wiki/Entropy#I ... ion_theory for more
http://en.wikipedia.org/wiki/Physics_envy
I went walking in the wasted city
Started thinking about entropy
Warren Zevon - "Run Straight Down"
L.
You can get what you want, or you can just get old. (Billy Joel, "Vienna")
- nisiprius
- Advisory Board
- Posts: 52107
- Joined: Thu Jul 26, 2007 9:33 am
- Location: The terrestrial, globular, planetary hunk of matter, flattened at the poles, is my abode.--O. Henry
Re: Chaos, complexity and entropy simplified
I'm about 2/3 of the way through the paper and I think it is excellent, excellent, excellent. And it fits very nicely with what I perceived and experienced. The word "obscene" literally means something so awful that you cannot bear to look directly at it. One of the things I really kick myself for is that in, let's say, the 1960s-80s I saw all sorts of things that fell under the general heading of "chaos" and "complexity" but treated them as obscene--I ignored them, I looked away from them, because I regarded them as wrong, or broken, or outside the range of things about which you could ask useful questions. It never occurred to me that they might be a more legitimate and more interesting object of regard than the kingdom of linearity.
For example, in the 1960s quality oscilloscopes like Tektronix had clean sweep-trigger circuits, but cheap ones didn't--and television sets didn't. They used a very strange, pragmatic kind of synchronization, in which the point at which the sawtooth oscillator circuit's sweep would end and retrace was simply modulated by the signal. There was a gain control on the sync signal. A lot of the time it would sorta-kinda work if the signal itself was fairly well behaved, but often it wouldn't, and you would get what was, in fact, chaotic sweep behavior. And instead of saying "cool! let's look at that" you'd just say "aw, cheap-and-cheesy sync circuit doesn't work properly" and you'd fiddle with the gain until it did.
When nonlinearity was studied, it was gentle nonlinearity--forming sum and difference frequencies, for example. The kind of nonlinearity you get when you turn up the gain until the sine-wave oscillator starts squawking irregularly like a squeaky hinge... that was not fair, that was beyond the pale.
As an early minicomputer hacker, we enjoyed writing "display hacks," which were short programs that produced interesting displays... and we knew that a good starting point for them was to start with some nice, well-behaved analog algorithm--like a circle-drawing algorithm--and then throw in some crazy digital nonlinearity, like an exclusive-or where there should have been a subtraction. If you were lucky, the algorithm would then go nuts producing fascinating "unpredictable" dot patterns and textures. We thought this was purely entertaining, and slightly improper--it was torturing a nice algorithm, when we should have been trying to make it more numerically precise.
I'm not sure where it fits into Baranger's story, but there was a sort of fad or belief system or faith that the behavior of complex real-world systems was not really that complicated, and that if you could just identify the major feedback loops and their topology, the behavior of the system would emerge--more or less independently of, or at least with very little sensitivity to, actual parameter values. I don't know if this was an outgrowth of successful World War II analog servo technologies like radar-guided antiaircraft guns. At any rate, Jay Forrester in particular was an influential exponent, and wrote a series of books, "Industrial Dynamics," "Urban Dynamics," and most famously "World Dynamics" (which led to the Club of Rome's "Limits to Growth" model), in which real-world behavior was supposed to be modeled by systems with only a few dozen variables and feedback loops. I believe this was actually the opposite of "chaos" and "complexity." I don't think Forrester's models exhibited chaotic behavior at all. But I don't know why not.
I don't have enough background in either information theory or thermo to explain the reason why information entropy IS the same as thermodynamic entropy, but it is. Shannon was not a fool, did not suffer from physics envy, and was not misappropriating a term. Wikipedia explains it, though I don't understand the explanation. I seem to recall that one of the resolutions of the Maxwell's Demon problem involves some unbelievably clever reasoning about the demon's ability to acquire information. Because the signal has to be strong enough to overcome the noise from the background heat, the demon requires energy to get the information about which molecules are moving quickly, and the cost of that energy turns out to exactly equal the energy "created" by opening and shutting the door... or something.
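The half-remembered resolution above is close to the modern Landauer/Bennett accounting, which charges the demon a minimum heat cost for erasing each bit of information it records. A back-of-envelope sketch (the numbers and function are mine, not from the post):

```python
# Hedged sketch: Landauer's principle says erasing one bit of information
# at temperature T dissipates at least k_B * T * ln(2) joules -- the usual
# modern bookkeeping that keeps Maxwell's demon from beating the Second Law.
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact since the 2019 SI redefinition)

def landauer_cost(bits, temperature_kelvin):
    """Minimum heat dissipated to erase `bits` of information at temperature T."""
    return bits * k_B * temperature_kelvin * math.log(2)

# Erasing one bit at room temperature (300 K): about 2.9e-21 J.
print(f"{landauer_cost(1, 300.0):.3e} J per bit")
```

Tiny per bit, but the demon must record (and eventually erase) one measurement per molecule it sorts, and the total cost at least cancels the work it appears to extract.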
Annual income twenty pounds, annual expenditure nineteen nineteen and six, result happiness; Annual income twenty pounds, annual expenditure twenty pounds ought and six, result misery.
Re: Chaos, complexity and entropy simplified
So Nisi, what is your 'take away' from this? In short?
Re: Chaos, complexity and entropy simplified
Nisiprius,
And the information theory of electronic signals applies to economic data, how?
L.
Last edited by Leeraar on Tue Jan 27, 2015 6:18 pm, edited 1 time in total.
- FreeAtLast
- Posts: 802
- Joined: Tue Nov 04, 2014 8:08 pm
Re: Chaos, complexity and entropy simplified
From statistical thermodynamics, entropy is a measure of disorder in a closed physical system, often indicated in an equation by S = k ln W.....where S is entropy, k is Boltzmann's constant and W is the partition function.....entropy is irreversible and always increases....it is a quantification of the inefficiency of a power cycle (often named after the Frenchman Sadi Carnot), ie, it equals the lost heat that you cannot make do useful work (and that is what a year of physical chemistry will do to your cerebrum!)
I believe Claude Shannon was trying to quantify the minimal amount of useful (understandable) information in data bits you could obtain through a phone connection.....and that he used the negative of entropy to represent it....we need a telephone engineer to speak up here.
Illegitimi non carborundum.
Re: Chaos, complexity and entropy simplified
FreeAtLast wrote: From statistical thermodynamics, entropy is a measure of disorder in a closed physical system, often indicated in an equation by S = k ln W.....where S is entropy, k is Boltzmann's constant and W is the partition function.....entropy is irreversible and always increases....it is quantification of the inefficiency of a power cycle (often named after the Frenchman Sadi Carnot), ie, it equals the lost heat that you cannot make do useful work (and that is what a year of physical chemistry will do to your cerebrum!)
I believe Claude Shannon was trying to quantify the minimal amount of useful (understandable) information in data bits you could obtain through a phone connection.....and that he used the negative of entropy to represent it....we need a telephone engineer to speak up here.
The partition function already has a negative sign in the exponent. Up to a scaling constant, Boltzmann-Gibbs entropy from physics is the same as Shannon entropy from information theory/statistics. Both quantify the effective number of states explored by a system, given the probabilities of the states.
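A minimal numerical sketch of that equivalence (my illustration, not from the paper): for a uniform distribution over W microstates, Shannon's H = -Σ p ln p reduces to ln W, so Boltzmann's S = k ln W is just k times the Shannon entropy in natural units.

```python
# Hedged sketch: Gibbs entropy S = -k * sum(p_i ln p_i) reduces to
# Boltzmann's S = k ln W when all W microstates are equally likely,
# and is Shannon's entropy (in nats) times the constant k.
import math

def shannon_entropy_nats(probs):
    """H = -sum p ln p, Shannon entropy in natural units (nats)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

W = 8
uniform = [1.0 / W] * W
print(shannon_entropy_nats(uniform))  # equals ln 8 for the uniform case
print(math.log(W))

# A peaked distribution "explores" fewer states, so its entropy is lower:
peaked = [0.9] + [0.1 / (W - 1)] * (W - 1)
print(shannon_entropy_nats(peaked) < math.log(W))
```

The only difference between the physicist's and the information theorist's quantity is the choice of units: multiply by Boltzmann's k for joules per kelvin, or divide by ln 2 for bits.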
Re: Chaos, complexity and entropy simplified
protagonist,
Thanks for the great article!
TJSI
- dbCooperAir
- Posts: 1107
- Joined: Tue Jan 07, 2014 9:13 pm
Re: Chaos, complexity and entropy simplified
I had to tag this post so I could come back and read the paper later, thanks.
Neither a wise man nor a brave man lies down on the tracks of history to wait for the train of the future to run over him. |
-Dwight D. Eisenhower-
Chaos, complexity and entropy [for non-physicists]
Great article.....and an even more enlightening discussion by my colleagues.
Thanks to all
Shawcroft
Re: Chaos, complexity and entropy [for non-physicists]
Here's the relevance of fractals to investing: How Fractals Can Explain What's Wrong with Wall Street - note the author.
(From this forum thread: What Fund Investors Should Learn from Benoit Mandelbrot (Oct 28, 2010))
-
- Posts: 536
- Joined: Tue Nov 25, 2008 2:34 pm
Re: Chaos, complexity and entropy [for non-physicists]
I'm hooked after reading only the first 3 pages. I'll complete the article soon.
Thanks a lot.
The finest, albeit the most difficult, of all human achievements is being reasonable.
Re: Chaos, complexity and entropy simplified
nisiprius wrote: I seem to recall that one of the resolutions of the Maxwell's Demon problem involves some unbelievably clever reasoning about the ability of the demon to acquire information. Because the signal has to be strong enough to overcome the noise from the background heat, the demon requires energy to get the information about which molecules are moving quickly, and the cost of that energy turns out to exactly equal the energy "created" by opening and shutting the door... or something.
As the paper says, the Second Law is subjective (a better term might be "phenomenological," as someone mentioned above). This means it is observer-dependent. It says that "processing" a system by the standard laws of physics does not decrease our ignorance about its state. At the end of the day it's an information processing inequality, which is mathematically true, but since information measures get their meaning from the law of large numbers, it's also true as a statistical statement (asymptotically in time, almost surely, other caveats, etc.). We're also placing ourselves outside the system (i.e. a closed system), so we assume whatever knowledge we "forget" about the system's initial state doesn't count as any loss of entropy (information) to the system. Convenient!
Maxwell's demon proposes to reverse the system by undoing the processing (running physics backwards). This is definitely possible (physics is reversible), and from the demon's perspective, the entropy of the system indeed decreases. Nothing wrong with that because the demon is acquiring information about the system with each measurement, collapsing its states. But now that entropy (information) is transferred out, we don't consider the system as closed any more. Again, convenient! Then we apply the information processing inequality to the whole system including the demon and recover the Second Law. We then "conclude" that any ignorance an outside observer sheds about the original system is transferred to the demon's internal state. The exact mechanism is not that important, and not really known; some say it's by the demon's computation, writing to memory, doing work, or whatever.
The trouble is physicists by choice only allow the possibility of a forgetting observer to be outside a closed system. They don't allow a learning observer to be, because learning requires actually constructing some states. Forgetting, well, who knows what you did know, before you forgot. Maybe you didn't know anything. That is the entirety of the Second Law, IMHO.
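One hedged way to make that "information processing inequality" concrete (my sketch, not the poster's): under a doubly stochastic transition matrix, the coarse-grained analogue of measure-preserving dynamics, Shannon entropy never decreases, so an outside observer's ignorance about the state can only grow.

```python
# Hedged illustration: applying a doubly stochastic transition matrix
# (every row AND column sums to 1) never decreases Shannon entropy --
# one precise reading of the "information processing inequality"
# behind the Second Law as described above.
import math

def entropy(p):
    """Shannon entropy in nats of a probability vector."""
    return -sum(x * math.log(x) for x in p if x > 0)

def step(p, T):
    """One transition: p'_j = sum_i p_i * T[i][j]."""
    n = len(p)
    return [sum(p[i] * T[i][j] for i in range(n)) for j in range(n)]

T = [[0.50, 0.25, 0.25],
     [0.25, 0.50, 0.25],
     [0.25, 0.25, 0.50]]

p = [1.0, 0.0, 0.0]  # start perfectly known: entropy 0
for _ in range(5):
    p_next = step(p, T)
    assert entropy(p_next) >= entropy(p) - 1e-12  # ignorance never shrinks
    p = p_next
print(entropy(p), math.log(3))  # entropy creeps toward ln 3, the maximum
```

The demon's trick, on this reading, is precisely that measurement is not a doubly stochastic map on the observer's probabilities, which is why it sits outside this inequality.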
Re: Chaos, complexity and entropy [for non-physicists]
After reading this paper:
It seems that there are known unknowns, and unknown unknowns.
Re: Chaos, complexity and entropy [for non-physicists]
protagonist, thanks for posting; interesting and thought-provoking. For those old enough to remember, the following sentence triggered a memory of Hari Seldon in Isaac Asimov's Foundation Trilogy:
But there is fairly complete agreement that the "ideal" complex systems, those which we would like most to understand, are the biological ones, and especially the systems having to do with people: our bodies, our groupings, our society, our culture
I'll have to come back later and read how Mandelbrot can explain stock prices.
"The greatest enemy of a good plan is the dream of a perfect plan" - Carl Von Clausewitz
- Epsilon Delta
- Posts: 8090
- Joined: Thu Apr 28, 2011 7:00 pm
Re: Chaos, complexity and entropy simplified
zeugmite wrote: The trouble is physicists by choice only allow the possibility of a forgetting observer to be outside a closed system. They don't allow a learning observer to be, because learning requires actually constructing some states. Forgetting, well, who knows what you did know, before you forgot. Maybe you didn't know anything. That is the entirety of the Second Law, IMHO.
The second law* was an empirical statement derived from observation of steam engines. It helped to explain how they worked. It was later observed that you could make the same sort of statement about many other systems. Inductive reasoning leaped to a universal law. The theory is trying to explain why, but even without any theory you'd still have the second law as a summation of the results of a massive number of experiments.
*It actually predates the first law.
Re: Chaos, complexity and entropy [for non-physicists]
"The Second Law" by P.W. Atkins (1984) talks about entropy, chaos and complexity. Further reading, if you enjoyed the paper linked in the OP.
L.
L.
Re: Chaos, complexity and entropy [for non-physicists]
I wonder if people realize that if the stock market is a chaotic system, that suggests it *is* predictable in the short term. I think there is more evidence for the market being "stochastic" than "chaotic". Chaotic systems are deterministic. Quoting the Wikipedia on chaos theory:
Distinguishing random from chaotic data
It can be difficult to tell from data whether a physical or other observed process is random or chaotic, because in practice no time series consists of a pure "signal". There will always be some form of corrupting noise, even if it is present as round-off or truncation error. Thus any real time series, even if mostly deterministic, will contain some randomness.
All methods for distinguishing deterministic and stochastic processes rely on the fact that a deterministic system always evolves in the same way from a given starting point. Thus, given a time series to test for determinism, one can:
1. pick a test state;
2. search the time series for a similar or nearby state; and
3. compare their respective time evolutions.
Define the error as the difference between the time evolution of the test state and the time evolution of the nearby state. A deterministic system will have an error that either remains small (stable, regular solution) or increases exponentially with time (chaos). A stochastic system will have a randomly distributed error.
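The quoted three-step test can be sketched in a few lines (assumptions mine: the logistic map at r = 4 stands in for a chaotic series, a seeded random draw for a stochastic one):

```python
# Hedged sketch of the quoted determinism test: in a chaotic but
# deterministic series (logistic map, r = 4), two nearby test states
# evolve almost identically at first, then diverge exponentially.
# A purely random series has no such short-term grace period.
import random

def logistic(x, n):
    """Iterate the chaotic logistic map x -> 4x(1-x) n times."""
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
    return x

x0, x1 = 0.2, 0.2 + 1e-9            # two nearby "test states"
short = abs(logistic(x0, 5) - logistic(x1, 5))
long_ = abs(logistic(x0, 60) - logistic(x1, 60))
print(short)   # still tiny: short-term prediction works
print(long_)   # the tiny initial error has typically been amplified to order one

# A stochastic series ties nothing together: successive draws are
# unrelated no matter how close in "time" they are.
random.seed(0)
print(abs(random.random() - random.random()))
```

This is the sense in which a genuinely chaotic market would be short-term predictable but long-term hopeless, which is the point under debate in this thread.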
Re: Chaos, complexity and entropy [for non-physicists]
Beliavsky wrote: I wonder if people realize that if the stock market is a chaotic system, that suggests it *is* predictable in the short term.
From where do you draw this conclusion? If this were true, it would also be predictable in the long run and it would not be chaotic. The savvy investor would just revise his predictions daily, or every minute, since he could reliably predict whether the market would rise or fall tomorrow or in the next minute, and he could act on that prediction. By short-term trading he would never lose.
I think you misunderstand the concepts of chaos and complexity. I think many people do, which is why I suggested this article. It's a good one.
By the way, I am really enjoying reading the entropy discussion.
Last edited by protagonist on Wed Jan 28, 2015 11:29 am, edited 1 time in total.
Re: Chaos, complexity and entropy simplified
FreeAtLast wrote: From statistical thermodynamics, entropy is a measure of disorder in a closed physical system, often indicated in an equation by S = k ln W.....where S is entropy, k is Boltzmann's constant and W is the partition function.....entropy is irreversible and always increases....it is quantification of the inefficiency of a power cycle (often named after the Frenchman Sadi Carnot), ie, it equals the lost heat that you cannot make do useful work (and that is what a year of physical chemistry will do to your cerebrum!)
I believe Claude Shannon was trying to quantify the minimal amount of useful (understandable) information in data bits you could obtain through a phone connection.....and that he used the negative of entropy to represent it....we need a telephone engineer to speak up here.
One correction - entropy is always increasing in a closed system. That is a very important distinction. Entropy can, and does, decrease in a non-closed system. It is critical to identify the system that is being modeled. Lay people who try to apply entropy laws to make pronouncements about the rise of order on our planet fail to do this.
I have always thought the second law of thermodynamics had wide applicability beyond thermal systems. Basically, the idea that it takes energy to derive useful work is a powerful concept. However, the entropy defined by Boltzmann in his statistical mechanics is specific to a set of conditions that work well for atomic particles but may not have any analog in finance. In particular, the notion of a large number of discrete states with statistical independence and equal probabilities does not seem to me to describe investing or investors.
Boltzmann was always my favorite physicist when I was in the game. His theory is a brilliant piece of work that was extremely bold and controversial in his day. He was widely ridiculed over his theory that was one of the first to make use of the atomic nature of matter. He eventually committed suicide and it wasn't until after his death that statistical mechanics became widely accepted as the theory which bridged thermodynamics with atomic theories.
Kolea (pron. ko-lay-uh). Golden plover.
Re: Chaos, complexity and entropy [for non-physicists]
The concept of irreversible entropy can be illustrated by a simple thought experiment: Take two glass bulbs of equal volume (denote them by A and B). They are connected by a glass tube with a closed stopcock. Each bulb contains 20 grams of Argon gas. The air temperature outside the bulbs is kept at a constant 32 degrees Fahrenheit. Well-known laws of physics tell us that each bulb contains approximately 300 billion trillion atoms of Argon. Open the stopcock. Even if you have never taken a science course in your life.....look carefully at the symmetry of the situation......does it not seem reasonable - all things being equal - that for every atom of Argon that travels from A into B.....in a very short period of time another atom of Argon will travel from B into A? Could you reasonably imagine a scenario where 300 billion trillion atoms from A would all travel into B.....while all the Argon atoms in B would stay put in B? The latter scenario is what a complete reversal of entropy would be......and, while I am not going to try to do the calculation......even a temporary imbalance between A and B much larger than the typical square-root-of-N fluctuation (well under a trillion atoms here) would be extraordinarily improbable.
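The arithmetic can be checked (a sketch of mine, not from the post). One caveat worth making explicit: random fluctuations in the A/B split scale as the square root of N, roughly a trillion atoms here, so transient imbalances of a mere million atoms are actually routine; it is macroscopic imbalances, let alone a full reversal, that are fantastically improbable.

```python
# Hedged check of the thought experiment's numbers: 20 g of argon
# (molar mass ~39.95 g/mol) is about 3e23 atoms -- "300 billion
# trillion" -- and typical A-vs-B fluctuations scale as sqrt(N).
import math

AVOGADRO = 6.02214076e23   # atoms per mole (exact, 2019 SI)
MOLAR_MASS_AR = 39.948     # g/mol for argon

atoms_per_bulb = 20.0 / MOLAR_MASS_AR * AVOGADRO
print(f"{atoms_per_bulb:.2e} atoms per bulb")   # ~3.0e23

N = 2 * atoms_per_bulb              # total atoms once the stopcock is open
typical_fluctuation = math.sqrt(N)  # scale of the A-minus-B imbalance
print(f"{typical_fluctuation:.2e}") # ~7.8e11: under a trillion atoms

# Chance that ALL N atoms sit in bulb A at once is 2**-N; even its
# base-10 EXPONENT is astronomically large.
log10_prob = -N * math.log10(2)
print(f"about 10^({log10_prob:.2e})")
```

So the symmetry argument in the post survives the numbers: the full reversal has probability around 10 to the minus 1.8e23, a number with no physical meaning except "never".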
TwoByFour: I agree - I should have made it clearer that I was always referring to a closed system.
Re: Chaos, complexity and entropy [for non-physicists]
protagonist wrote:
Beliavsky wrote: I wonder if people realize that if the stock market is a chaotic system, that suggests it *is* predictable in the short term.
From where do you draw this conclusion? If this were true, it would also be predictable in the long run and it would not be chaotic.
No -- you misunderstand what chaos means. I suggest reading the Wikipedia article. The weather can be thought of as a chaotic system. Short-term forecasts that are better than random can be made, but long-term forecasts are no better than those based on seasonality.
Re: Chaos, complexity and entropy [for non-physicists]
Beliavsky wrote: I wonder if people realize that if the stock market is a chaotic system, that suggests it *is* predictable in the short term. I think there is more evidence for the market being "stochastic" than "chaotic". Chaotic systems are deterministic.
Well, that is only a definition. I am not sure it means anything in practical terms. Weather is thought of as being chaotic but it is far from being deterministic in a practical (and accurate) sense. If you ever read the daily analysis done by NOAA, they typically have the results of three massive computer simulations, each of which is often different. The analyst picks the computer run that from his/her experience is the right one and that becomes the forecast. It is quite ad hoc.
Re: Chaos, complexity and entropy [for non-physicists]
Beliavsky wrote:
protagonist wrote:
Beliavsky wrote: I wonder if people realize that if the stock market is a chaotic system, that suggests it *is* predictable in the short term.
From where do you draw this conclusion? If this were true, it would also be predictable in the long run and it would not be chaotic.
No -- you misunderstand what chaos means. I suggest reading the Wikipedia article. The weather can be thought of as a chaotic system. Short-term forecasts that are better than random can be made, but long-term forecasts are no better than those based on seasonality.
I never suggested that the market behaves randomly. Yes, the chance of massive fluctuation in the market as a whole in the next second is very low, which would be incompatible with a market that responded randomly but is expected of a complex system whose behavior emerges from its several, multilayered components. This also makes the system robust. The parallel between the market and the weather is an accurate one. Both are emergent systems that are exquisitely sensitive to initial conditions. Divergence of probabilities from results expected under linear (reductionist) models grows exponentially with time. Except, as you said, there are fairly reliable "seasonal" variations in weather based on well-understood astrophysical principles, whereas an analog in the world of finance is lacking. This is one reason why long-term financial modeling doesn't work.
I am surprised you are drawing this (accurate) parallel with long-term weather forecasting, since I was of the understanding that you were a firm believer in the validity of long-term financial modeling. Perhaps I am wrong about that.
Last edited by protagonist on Wed Jan 28, 2015 2:49 pm, edited 1 time in total.
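The sensitivity to initial conditions described above can be made concrete with a toy model. Here is a minimal sketch in Python using the logistic map, a standard textbook chaotic system (it is not taken from Baranger's paper, and it is certainly not a market model): two trajectories that start 1e-10 apart become completely different within a few dozen iterations.

```python
# Sensitive dependence on initial conditions in the logistic map x -> r*x*(1-x).
# With r = 4 the map is chaotic: nearby trajectories separate exponentially.

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x, y = 0.3, 0.3 + 1e-10  # two starting points differing in the 10th decimal place
for step in range(60):
    x, y = logistic(x), logistic(y)
    if abs(x - y) > 0.5:  # trajectories have completely decorrelated
        break

print(f"order-1 separation after {step + 1} iterations")
```

A 1e-10 error that roughly doubles each iteration needs only about 33 doublings to reach order 1, which is why better measurement of the initial condition buys almost no extra forecasting range.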
Re: Chaos, complexity and entropy simplified
This is another fascinating companion of the Second Law: it takes free energy (first-year physics calls it "high quality energy"), and not just "any" energy like heat, to do mechanical work. What is free energy, you ask? Well, it's defined circularly as energy available to do thermodynamic work. At the core of it, though, is the distinction of information. Recall that Maxwell's demon sits at a trapdoor and measures the gas particles. If you, like Maxwell's demon, acquired knowledge of the entire current state of the system (position and momentum of every particle, to start with), you would be able to extract all the energy from the system, by setting up tiny pistons at the appropriate angles, and so on. Or do as the demon does: get the gas into one side, so that there is a macroscopic state known even to the macroscopic observer, who can use it to do work. But without the demon, a macroscopic observer is ignorant of almost the entire state of the system (knowing only temperature, volume and pressure), so for that observer, free energy means only what you can do with temperature reservoirs, expanding gases, and so on. Free energy, and what counts as useful energy, again depend on the observer. It's all about information.
TwoByFour wrote: I have always thought the second law of thermodynamics had wide applicability beyond thermal systems. Basically, the idea that it takes energy to derive useful work is a powerful concept.
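The information-to-work link in this post can even be made quantitative. In the textbook Szilard-engine analysis (a standard result, not a claim made in the post above), one bit of information about the gas lets the observer extract at most k_B * T * ln 2 of work:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def max_work_per_bit(t_kelvin):
    """Szilard-engine bound: one bit of information yields at most k_B*T*ln(2) of work."""
    return K_B * t_kelvin * math.log(2)

print(max_work_per_bit(300.0))  # roughly 2.9e-21 J per bit near room temperature
```

The same quantity is the Landauer limit on the energy cost of erasing one bit, which is what ultimately saves the Second Law from the demon: the demon's memory has to be cleared eventually.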
Re: Chaos, complexity and entropy [for non-physicists]
There is a very important assumption that always gets left out. This is highly improbable only if the initial state of the system is randomly drawn from all the possibilities not ruled out by our knowledge. Our knowledge is fairly deficient: again, we only know that the gases are initially mixed at some temperature and volume, but we don't know where the particles are or where they are going. If they began in a very special state, such as the exact time-reverse of a mixed state that was itself reached from an unmixed one, then physics says everything runs in reverse and they end up unmixed (for a moment, anyway).
FreeAtLast wrote: Could you reasonably imagine a scenario where 300 billion trillion atoms from A would all travel into B.....while all the Argon atoms in B would stay put in B? The latter scenario is what a complete reversal of entropy would be......and, while I am not going to try to do the calculation......even a temporary imbalance between A and B of 1,000,000 atoms would be extraordinarily improbable.
What is to say that the special state I described is any more unlikely than the special state of unmixed gases? Nothing. The two are in one-to-one correspondence, so they have the same probability. The only difference is that we can't recognize the first kind of state, while we can recognize the second. Observer dependent.
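The time-reversal argument can be demonstrated with a toy deterministic system. The sketch below (idealized point particles bouncing in a one-dimensional box, nothing more) starts everything in the left half, lets the positions mix, then flips every velocity: the resulting "mixed" state is exactly the special kind described above, and the same dynamics evolves it straight back to the unmixed one.

```python
import random

random.seed(0)

def step(xs, vs, dt=0.01):
    """One time step of free flight in [0, 1] with elastic reflections at the walls."""
    for i in range(len(xs)):
        xs[i] += vs[i] * dt
        if xs[i] > 1.0:       # bounce off the right wall
            xs[i] = 2.0 - xs[i]
            vs[i] = -vs[i]
        elif xs[i] < 0.0:     # bounce off the left wall
            xs[i] = -xs[i]
            vs[i] = -vs[i]

# All particles start in the left half of the box: an "unmixed" macrostate.
xs = [random.uniform(0.0, 0.5) for _ in range(50)]
vs = [random.uniform(-1.0, 1.0) for _ in range(50)]
start = xs[:]

for _ in range(500):   # let the positions spread over the whole box
    step(xs, vs)

vs = [-v for v in vs]  # the "special" state: same positions, reversed velocities

for _ in range(500):   # the same dynamics now un-mixes the gas
    step(xs, vs)

print(max(abs(a - b) for a, b in zip(xs, start)))  # ~0: back to the initial state
```

Nothing in the dynamics distinguishes the two directions of time; the reversed microstate is just as lawful as the original, merely unrecognizable to an observer who sees only "mixed".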
-
- Posts: 64
- Joined: Mon Jun 02, 2014 10:12 am
Re: Chaos, complexity and entropy [for non-physicists]
Benoit B. Mandelbrot.
The B stands for Benoit B. Mandelbrot.
- Epsilon Delta
- Posts: 8090
- Joined: Thu Apr 28, 2011 7:00 pm
Re: Chaos, complexity and entropy [for non-physicists]
The bold bit is not true. An imbalance of 1,000,000 atoms turns out to be a 0.0000013-sigma event, colloquially known as darn near certain. The issue is that when you have almost a septillion atoms, a million is not a lot.
FreeAtLast wrote: The concept of irreversible entropy can be illustrated by a simple thought experiment: Take two glass bulbs of equal volume (denote them by A and B). They are connected by a glass tube with a closed stopcock. Each bulb contains 20 grams of Argon gas. The air temperature outside the bulbs is kept at a constant 32 degrees Fahrenheit. Well-known laws of physics tell us that each bulb contains approximately 300 billion trillion atoms of Argon. Open the stopcock. Even if you have never taken a science course in your life.....look carefully at the symmetry of the situation......does it not seem reasonable - all things being equal - that for every atom of Argon that travels from A into B.....that in a very short period of time another atom of Argon will travel from B into A? Could you reasonably imagine a scenario where 300 billion trillion atoms from A would all travel into B.....while all the Argon atoms in B would stay put in B? The latter scenario is what a complete reversal of entropy would be......and, while I am not going to try to do the calculation......even a temporary imbalance between A and B of 1,000,000 atoms would be extraordinarily improbable.
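Epsilon Delta's 0.0000013 figure can be checked with simple binomial statistics. Assuming the two 20-gram charges of Argon together make about one mole (6E23 atoms), and each atom independently ends up in either bulb with probability 1/2, the standard deviation of the count in one bulb is sqrt(N)/2, and a 1,000,000-atom imbalance is a deviation of only 500,000 from the mean:

```python
import math

N = 6e23                    # total Argon atoms: 40 g of Argon is about one mole
sigma = math.sqrt(N) / 2    # binomial std dev of the count in one bulb (p = 1/2)
deviation = 1_000_000 / 2   # a 1e6-atom imbalance puts one bulb 5e5 above the mean

print(sigma)              # about 3.9e11 atoms
print(deviation / sigma)  # about 1.3e-6, i.e. the quoted 0.0000013 sigma
```

So a million-atom imbalance is not merely possible; deviations hundreds of thousands of times larger are routine.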
Re: Chaos, complexity and entropy [for non-physicists]
ducksauce9 wrote: Benoit B. Mandelbrot.
The B stands for Benoit B. Mandelbrot.
Mandelbrot's home page: Benoit B. Mandelbrot - There's a lot of reading material freely available, check the left-side menu.
Here's one for finance: Fractal and Multifractal Finance
Update: To address Leeraar's post below, I forgot to mention that Mandelbrot passed away in 2010. Here's an extensive Benoît Mandelbrot obituary.
Re: Chaos, complexity and entropy [for non-physicists]
By the way, Mandelbrot passed away in 2010.
I think the caution from Mandelbrot, Taleb, and others is that there is a human desire to equate things we don't know to things we do know, to raise our confidence that we understand them better. This leads us to underestimate unanticipated risk, or "fat tails", or "black swans".
That's about it.
L.
You can get what you want, or you can just get old. (Billy Joel, "Vienna")
Re: Chaos, complexity and entropy [for non-physicists]
Well. This was a bit of a walk down memory lane, since Michel Baranger was my PhD advisor!
protagonist wrote: I have often referred to these concepts in this forum as a counter-argument to those who believe in the value of economic models based on limited past data regarding long-range future prediction of success.
I stumbled upon this excellent, simplified discussion of these phenomena entitled "Chaos, Complexity, and Entropy: A physics talk for non-physicists", by Michael Baranger of MIT: http://necsi.edu/projects/baranger/cce.pdf
By the way, his name was the French name "Michel", not the English name Michael. He had a hilarious bit he'd do when introducing himself to Americans who would say "Michelle? But that's a girl's name!"
The article is very much in his style (ca 2000, several years after I'd graduated). He loved explaining things for non-technical audiences, and sometimes succeeded at it. He also was an excellent teacher. (Though when I was TA'ing his graduate course in quantum mechanics one year, they took to calling him "Beldar the Conehead" because he was bald, claimed to be from France, and said somewhat incomprehensible things. But they came around to his way of thinking by the end of the semester.) He could range from this level of verbal description for popular audiences, to picking integral representations of Bessel functions out of the conversational air as though he knew that's what you were already thinking. He wasn't trying to be intimidating; he just spent his life thinking about stuff that became second nature to him.
Unfortunately, he died last fall, at age 87.
Re: Chaos, complexity and entropy simplified
zeugmite wrote: The trouble is that physicists, by choice, only allow the possibility of a forgetting observer outside a closed system. They don't allow a learning observer there, because learning requires actually constructing some states. Forgetting, well, who knows what you knew before you forgot. Maybe you didn't know anything. That is the entirety of the Second Law, IMHO.
Murray Gell-Mann wrote:Imagine how complicated physics would be if particles could think.
Most of my posts assume no behavioral errors.
- FreeAtLast
- Posts: 802
- Joined: Tue Nov 04, 2014 8:08 pm
Re: Chaos, complexity and entropy [for non-physicists]
Epsilon Delta: Ok, you piqued my curiosity.....how did you calculate the 0.0000013 sigma event for the million atom imbalance?.....and let's postulate that before the stopcock is opened.....there are exactly the same number of Argon atoms in bulb A as bulb B.
Illegitimi non carborundum.
Re: Chaos, complexity and entropy [for non-physicists]
You can approximate the situation by saying that each atom is headed toward bulb A or bulb B. So it's equivalent to flipping a coin Avogadro's number (6E23) of times. The uncertainty in the number of "heads" is the square root of the expectation value, in other words about 6E11. So an imbalance of 1E6 atoms moving in one direction is several orders of magnitude less than the uncertainty.
FreeAtLast wrote: Epsilon Delta: Ok, you piqued my curiosity.....how did you calculate the 0.0000013 sigma event for the million atom imbalance?.....and let's postulate that before the stopcock is opened.....there are exactly the same number of Argon atoms in bulb A as bulb B.
In reality the problem is a little more complicated: the atoms are moving in three dimensions, and geometric factors come into play (how separated are the two flasks, and what is the width of the tube connecting them?)
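The 1/sqrt(N) scaling behind this coin-flip picture can be checked by direct simulation at toy sizes (nowhere near 6E23 flips, of course): the typical absolute imbalance grows like sqrt(N), so the imbalance as a fraction of all the atoms shrinks as N grows.

```python
import random

random.seed(42)

def typical_imbalance(n_atoms, trials=200):
    """Average |count in A - count in B| when each atom picks a bulb at random."""
    total = 0
    for _ in range(trials):
        in_a = sum(random.getrandbits(1) for _ in range(n_atoms))
        total += abs(2 * in_a - n_atoms)  # imbalance = |#A - #B|
    return total / trials

for n in (100, 10_000):
    imb = typical_imbalance(n)
    print(n, imb, imb / n)  # absolute imbalance grows, fractional imbalance shrinks
```

Extrapolating the sqrt(N) trend to 6E23 atoms gives a typical imbalance of a few times 10^11 atoms, consistent with the sigma quoted earlier in the thread.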
Most of my posts assume no behavioral errors.
-
- Posts: 930
- Joined: Wed Apr 16, 2014 8:11 pm
- Location: Flyover Country
Re: Chaos, complexity and entropy [for non-physicists]
Thanks for posting the article.
Do you or does anyone know if there is a video or audio of the talk or any such talk available?
I don't know anything.
Re: Chaos, complexity and entropy [for non-physicists]
The paper suggests, to me, that the very act of perceiving an event actually increases disorder/uncertainty by implicating the observer in the event itself.
Is that correct?
Re: Chaos, complexity and entropy [for non-physicists]
This is the most interesting thing I've read in quite some time - thanks for posting! Bookmarking to follow the discussion and to continue learning from all of you.
Re: Chaos, complexity and entropy [for non-physicists]
Try these:
http://www.ted.com/talks/dan_cobley_wha ... _marketing
http://www.ted.com/talks/benoit_mandelb ... _roughness
Also, search on: video second law
L.
You can get what you want, or you can just get old. (Billy Joel, "Vienna")
Re: Chaos, complexity and entropy [for non-physicists]
Yes.
jstash wrote: The paper suggests, to me, that the very act of perceiving an event actually increases disorder/uncertainty by implicating the observer in the event itself.
Is that correct?
Well, maybe. See the Heisenberg Uncertainty Principle*, and the definition of a "closed system" for entropy and thermodynamics. Generally, the act of observing interferes. Whether it increases disorder, I am not sure**.
L.
* My teenager's room: If you can find an article of clothing, you don't know if it's clean. If you know it's clean, you can't find it.
** Maybe if people did not check stock prices all the time, there would be less volatility.
You can get what you want, or you can just get old. (Billy Joel, "Vienna")
Re: Chaos, complexity and entropy [for non-physicists]
Before the thread is locked, could anyone please explain how the paper is actionable? What is the main point of the paper for non-physicists or non-financial experts like me? Thank you.
"The two most important days in someone's life are the day that they are born and the day they discover why." -John Maxwell
-
- Posts: 9242
- Joined: Sun Dec 26, 2010 11:47 am
Re: Chaos, complexity and entropy [for non-physicists]
I can try, ObGyn, to explain how this knowledge has been "actionable" for me (and why I chose this subforum to post this thread).
obgyn65 wrote: Before the thread is locked, could anyone please explain how the paper is actionable? What is the main point of the paper for non-physicists or non-financial experts like me? Thank you.
Most of what we do in investing is based on assumptions regarding the predictability of future events based on past data, and very limited past data at that. I suppose as "rational beings" we have to do that to organize our world, or we would have no basis on which to act. Understanding that the economy (and at a lower level, the market) is a complex system and follows the rules applying to complex systems allows me to laugh at my assumptions and see them as resting on very thin ice indeed. I see discussions as to whether to tilt or not to tilt, or whether to withdraw 4% or 3% or 2% of my income annually in my retirement, or how to tweak my asset allocation so as to maximize my wealth when I reach the ripe old age of 100, as frankly ridiculous. I still have to make certain decisions because I am alive. But lack of control over future events is, paradoxically, liberating. Being able to make important decisions and, at the same time, look in the mirror and laugh at yourself, freeing yourself from assumptions that lack feet, makes you more robust. And the more robust you are, the greater the likelihood, I believe, that you will survive the next big bang (in this case financially, but you can extend the thought to your life in general). To me that is BIG. It makes me a much happier and more resilient person. And why do we care about money anyway, other than increasing our happiness?
Will it make me change my AA from 50/50 to 60/40 or increase my international exposure or draw up a budget? No. But I will sleep a lot easier with my decisions. And I will ignore all the books and charts and "experts" that tell me that if I do x,y or z based on the behavior of a given portfolio from 1974-2011 I will stand a 75% chance of becoming 200% richer in the next 40 years. I consider that actionable.
Plus I don't panic. There is little point in panic over things that you know intrinsically you cannot control. Panic is a state where we make our worst decisions, financial or otherwise.
I firmly believe this stuff, and I post things like this to try to help others.
As for the technical discussion re: Maxwell's demon and entropy, I'm just enjoying it and hope the admins don't remove it. I had no idea there were so many physicists in this forum. I'm learning from them. (The number of highly intelligent, expert participants in this forum from all walks of life astounds me....I never thought this would generate a technical discussion of entropy.)
Last edited by protagonist on Thu Jan 29, 2015 9:40 am, edited 8 times in total.
-
- Posts: 9242
- Joined: Sun Dec 26, 2010 11:47 am
Re: Chaos, complexity and entropy [for non-physicists]
My given name is Michael. I entered middle school, and my first French class, just when the Beatles song "Michelle" hit the charts. I can feel his pain.
sgr000 wrote:
By the way, his name was the French name "Michel", not the English name Michael. He had a hilarious bit he'd do when introducing himself to Americans who would say "Michelle? But that's a girl's name!"
Re: Chaos, complexity and entropy [for non-physicists]
There's actually an analogy between statistical mechanics and market dynamics:
efficient market <-> thermal equilibrium
free energy <-> alpha
active managers <-> (would be) Maxwell's demons
This analogy popped into my head several years ago. In googling a bit, a number of other people had already thought of it.
Most of my posts assume no behavioral errors.
- FreeAtLast
- Posts: 802
- Joined: Tue Nov 04, 2014 8:08 pm
Re: Chaos, complexity and entropy [for non-physicists]
baw703916: Aha, you and I are thinking the same way, in that you have to consider the actual physics of the experimental set-up, and not just hypothetical "flipping a coin" mathematics.....for example, what if we increased the size of each bulb ten times.......and lowered the ambient temperature to minus 100 degrees F.......these actions would significantly reduce the root-mean-square velocity of the atoms......and then we made the connecting tube five feet long with an inner diameter of only 0.1 millimeters?.....would the proposed imbalance of one million atoms occur any time within a million years?
BTW, I have no idea what this discussion has to do with everyday fluctuations in the S&P 500.....but it has been a lot of fun for me in agitating some old brain cells that have been quiescent for decades (and now we wait for the thread to be locked).
Edit: And if you only increased the size of the bulbs without reducing the ambient temperature.....the number of times per second that the Argon atoms would enter the tube openings would be decreased......which still reduces the mixing rate between the bulbs.
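FreeAtLast's intuition about cooling can be put into numbers with the kinetic-theory formula v_rms = sqrt(3kT/m) (a standard ideal-gas result; the temperatures are the ones proposed above): going from 32 F down to minus 100 F slows the Argon atoms by only about 15 percent, so the long narrow tube would have to do most of the work of throttling the mixing.

```python
import math

K_B = 1.380649e-23                    # Boltzmann constant, J/K
M_ARGON = 39.948 * 1.66053906660e-27  # mass of one Argon atom, kg

def v_rms(t_kelvin):
    """Root-mean-square speed of an ideal-gas atom: sqrt(3*k*T/m)."""
    return math.sqrt(3 * K_B * t_kelvin / M_ARGON)

def fahrenheit_to_kelvin(t_f):
    return (t_f - 32.0) * 5.0 / 9.0 + 273.15

v_warm = v_rms(fahrenheit_to_kelvin(32.0))    # about 413 m/s at 32 F
v_cold = v_rms(fahrenheit_to_kelvin(-100.0))  # about 353 m/s at -100 F
print(v_warm, v_cold, v_cold / v_warm)        # speeds drop by only ~15 percent
```

Because v_rms scales as the square root of absolute temperature, even drastic Fahrenheit changes move the speeds surprisingly little.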
Last edited by FreeAtLast on Thu Jan 29, 2015 2:23 pm, edited 1 time in total.
Illegitimi non carborundum.
-
- Posts: 536
- Joined: Tue Nov 25, 2008 2:34 pm
Re: Chaos, complexity and entropy [for non-physicists]
An excellent post!
protagonist wrote: I can try, ObGyn, to explain how this knowledge has been "actionable" for me (and why I chose this subforum to post this thread).
obgyn65 wrote: Before the thread is locked, could anyone please explain how the paper is actionable? What is the main point of the paper for non-physicists or non-financial experts like me? Thank you.
Most of what we do in investing is based on assumptions regarding the predictability of future events based on past data, and very limited past data at that. I suppose as "rational beings" we have to do that to organize our world, or we would have no basis on which to act. Understanding that the economy (and at a lower level, the market) is a complex system and follows the rules applying to complex systems allows me to laugh at my assumptions and see them as resting on very thin ice indeed. I see discussions as to whether to tilt or not to tilt, or whether to withdraw 4% or 3% or 2% of my income annually in my retirement, or how to tweak my asset allocation so as to maximize my wealth when I reach the ripe old age of 100, as frankly ridiculous. I still have to make certain decisions because I am alive. But lack of control over future events is, paradoxically, liberating. Being able to make important decisions and, at the same time, look in the mirror and laugh at yourself, freeing yourself from assumptions that lack feet, makes you more robust. And the more robust you are, the greater the likelihood, I believe, that you will survive the next big bang (in this case financially, but you can extend the thought to your life in general). To me that is BIG. It makes me a much happier and more resilient person. And why do we care about money anyway, other than increasing our happiness?
Will it make me change my AA from 50/50 to 60/40 or increase my international exposure or draw up a budget? No. But I will sleep a lot easier with my decisions. And I will ignore all the books and charts and "experts" that tell me that if I do x, y or z based on the behavior of a given portfolio from 1974-2011 I will stand a 75% chance of becoming 200% richer in the next 40 years. I consider that actionable.
Plus I don't panic. There is little point in panic over things that you know intrinsically you cannot control. Panic is a state where we make our worst decisions, financial or otherwise.
I firmly believe this stuff , and I post things like this to try to help others.
As for the technical discussion re: Maxwell's demon and entropy, I'm just enjoying it and hope the admin's don't remove it. I had no idea there were so many physicists in this forum. I'm learning from them. (The amount of highly intelligent expert participants in this forum from all walks of life astounds me....I never thought this would generate a technical discussion of entropy).
I do see the relevance of your article on this financial forum. I would be extremely surprised if the thread is locked, unless someone tries to convince us that he or she can make entropy a driving force of the S&P 500.
The finest, albeit the most difficult, of all human achievements is being reasonable.