This builds on "Fooled by Randomness" - some of the concepts reappear in greater depth, and there's a lot that's new.
We evolved in East Africa (presumably) to handle a much simpler world with fewer sources of Black Swans - e.g., a new enemy, a new dangerous animal, a difficult weather change. In that world one could make quick inferences and generally be largely correct. Now there are many more sources, and many take much longer to play out.
Similar - we automatically seek a cause for occurrences - helpful and efficient, but with significant risk of misleading us. Initially valuable for efficiency in a simpler environment.
Similar - categorizing and classifying - much happens unconsciously, it seems. Misleads all too often.
Similar - we love narratives and explanations - it takes much more effort to store long strings of random information. Our brains unconsciously summarize, theorize, simplify - an evolved way of gaining efficiency. Again, this probably worked well over the long stage of evolution; it involves more hazards now.
Deliberate and tiring effort is required to resist these wired-in impulses. We can't consistently do so. Keep reminding oneself.
We're not conscious that huge swaths of information lie outside what we are aware of. However comprehensive we consider our knowledge of any given subject, it's pathetically incomplete. Creating the habit of keeping this in mind would help drive humility about what we do and can know.
We can often know with confidence that propositions are untrue. We can seldom if ever know with confidence that propositions are true. Remember this.
Difficulty or impossibility of identifying causation when looking backward. He mentions the popular business books and states better than I could the problem that caused me to stop reading them - a surviving company looking back in hindsight - even if not pernicious, the story is made up.
Looking forward also difficult or impossible - forecaster track records are awful. It's OK if one is aware of this!
Futurist writers have the same problem - great at authoritatively explaining why things happened a certain way in the past, not so much the present/future. I wrote this in 2007 as I was losing interest in such writers.
He loves Montaigne. (Also here.)
Hayek. Bottom-up knowledge.
Off topic in a way, but Taleb adopted an unusual exercise and diet approach - aping primitive man, reasoning that so many years of evolution in that state suggest modern approaches aren't optimal. I won't copy this.
It's a little frustrating to hear such a good explanation of the limits of knowledge (and the limits of knowing) with little in the way of recommendations about what to do about it. As well-received as the book was, apparently I'm not the only reader who felt this way. This edition includes an ~80-page additional chapter in which Taleb takes on this complaint... though in the end I didn't pick up much in the way of specific action items.
The value is in improving one's ability to keep an open mind, and one's overall approach to recognizing problems and humbly choosing the best response path - all while recognizing the limits of knowing and remaining flexible as new information filters in.
That's a lot of value.
[Writing about the 1987 crash - no internet; we checked market status by phone in those days. It wasn't clear what had happened or why the recovery was so quick. He later mentions the LTCM collapse and Black-Scholes - client companies used their modeling to value stock options. Big episodes for folks my age.]