Unthinkable Thoughts

‘‘There are wavelengths that people cannot see, there are sounds that people cannot hear, and maybe computers have thoughts that people cannot think.’’

Richard Wesley Hamming

Something that is simultaneously trivial and fascinating (to me) is that the deeds of some other person my age will be very different, and sometimes far greater, than my own. Trivial because it’s so obviously true, for all of us. But fascinating because this person has had the same period of time and an almost identical human body to work with, so their accomplishments must have stemmed from differences in their environment and/or thoughts they had that I did not. No doubt the former plays a role, but if we confine our analysis to a college classmate I think we can establish a role for the latter as well: Sitting in the same auditoriums, coming from a similar background, this person somehow achieves a different understanding of the subject matter (and the world).

We apply the term “genius” to those who make important realizations that escaped everyone else, and try hard to explain what made these people special. Why didn’t the Theory of Relativity occur to everyone? While such explanations often emphasize a special combination of talent and opportunity, it also appears that simple birth defects and accidents can produce genius-level ability in the otherwise unremarkable. Based on this, one might propose that our brains normally have barriers which block many thoughts from appearing. But what is the system that determines which thoughts we have? And as a natural extension of this, how do we set ourselves up to have the widest/wisest range of thoughts?

Your thought-subset

As so often happens, Paul Graham has an interesting comment on this: he argues that we’re unable to think clearly about things that are part of our identity (e.g. religion, ancestry, preference for Apple products), and so to expand the range of topics you can productively think about you need to minimize your identity. Thought provoking (*cough*), and a seemingly perfect philosophy if you’re a Buddhist inventor. But it does seem more like a surgeon than a full-on savior: even if true, it only tells us how to remove certain specific blocks from our thinking.

Another proposition comes from an instruction that I wish someone had given my undergraduate self. We students were frustrated with having to cram huge curricula in some courses, and often vented about the folly of closed book exams. The important thing was being able to find information on demand, not memorizing tons of facts you might never need, right? Well, kind of… Now that I’ve spent some time thinking for a living, it’s clear that most of our progress comes from connecting dots. That is, coming up with solutions based on multiple pieces of information. Sure, you could easily look up those same pieces of information, but if they’re not already in your head when you encounter the problem you miss out on the solution. Based on this, a big limitation to what we’re able to think would simply be the quality and quantity of dots already in our heads; the more you already know about, the wider range of thoughts you can have.

To me this seems quite in line with empirical evidence, although it’s also obvious that other factors play a role. For instance, there’s the person who knows a lot of facts but somehow can’t venture into uncertain territory. To quote Hamming again:

If you read all the time what other people have done you will think the way they thought. If you want to think new thoughts that are different, then do what a lot of creative people do − get the problem reasonably clear and then refuse to look at any answers until you’ve thought the problem through carefully how you would do it, how you could slightly change the problem to be the correct one.


So your ability to think clearly plays into it, as does the amount of knowledge you have to draw on. It seems to me that there's also a huge amount of filtering that occurs before thoughts even enter your consciousness. That is, your brain actually processes a multitude of thoughts for every one that you're aware of, but most of them are discarded almost immediately. Whatever governs this filtering process must have a profound effect on at least our subjective experience of thinking. I don't know enough neuroscience to say anything really rigorous on this subject, but intuitively it seems possible that the filter is simply the synaptic patterns formed by your past experiences. On a side note, maybe that's where déjà vu comes from: subconscious processing leaking slightly into the conscious domain, so that when the thought is presented to consciousness proper it seems to have (indeed has) happened before.

Such a filtering mechanism would certainly constitute a type of biological limit to what thoughts you are able to think. But one could easily imagine more profound limits based on the physiological wiring of our brains. It would naturally follow that different wiring would allow different thoughts. Returning to the original quote from Hamming, I’m sure most would agree that our deterministically programmed computers can’t think human thoughts. But perhaps they (or future versions of them) can think a different type of thoughts, which we in turn aren’t able to.

In other words, the Venn diagram might look like this:

Your thought-subset, advanced

You might find it difficult to imagine the types of thoughts a computer would have; I certainly do. In order to come up with a decent answer we would need to examine what constitutes a thought, which I’ll leave for another post. But lest we let the barriers in our brains censor the very idea of inhuman thoughts, I’ll end with this reminder from Schopenhauer:

“Every man takes the limits of his own field of vision for the limits of the world.”

Personality osmosis

I believe it was Jim Rohn who said "You are the average of the five people you spend the most time with", i.e. that your thoughts and personality (and consequently actions) will be shaped by the people around you. I don't know if his statement is exactly right, but I'm willing to wager that it's roughly correct. Which naturally prompts the curious-minded to ask: Why? And so what?

I can see several mechanisms to account for 'why'. One is that, in my opinion, humans fundamentally want to agree. Or rather that our brains always do what they can to remove us from conflict, one way or another. At the same time, any two people are inevitably going to disagree on certain points, which represents a (mild) conflict that our brains seek to resolve. Many disagreements will be resolved through a bit of discussion, but a few core beliefs might well resist such resolution. What then? Since we're discussing the people we spend the most time with, I'm going to assume that avoiding exposure to the disagreement is not a likely solution. Further eliminating the option of beating our friend to death with a hominid thighbone, it seems to me that our brains would be incentivized to escape the friction by shifting beliefs and assumptions until they are sufficiently aligned with the other person's. Of course you could argue that spending more time in disagreement simply allows one to rationally appreciate the other person's point of view, and I fully agree that that's part of it. But the subconscious 'conflict avoidance' behavior shows up in so many other scenarios (and is so evolutionarily obvious) that I think it would be foolish not to ascribe it a part as well.

Another possible mechanism stems from the idea that conversations are an important way for us to process our thoughts. The degree to which this is true for each of us varies[1], but I don't think anyone would deny that in addition to (and sometimes instead of) exchanging information, conversations allow us to permute and examine information we already have. It stands to reason that if we use conversation to process our thoughts (and tend to avoid solitary contemplation when possible), then what we talk about is what we think about. And thus whatever we have the most opportunity to talk about is going to constitute a large part of our thinking.

I don’t know how much this affects our deliberate thoughts, but it definitely plays in when we’re done with work, tired but without a specific agenda. We join our friends or housemates in whatever they happen to be discussing, and this discussion will stick in our heads as we go to bed. The next time we meet these friends, it feels natural to refer back to previous subjects. Before we know it we’ve started caring more about the issues of whatever group we spend time in, and have forgotten some of the things we used to care about. Our values shift a bit, and over time what we know about shifts as well. Conversely, this means that you are less able to develop your understanding of anything that your community is unable or unwilling to discuss (in casual conversation, mind you).


Which brings us to the 'so what'. Namely: If the people we spend the most time with shape us to such a degree, should we perhaps make more conscious choices about who we spend time with? Only I don't think this should take the form of 'politely edit your friends', which is how I've mainly seen Jim Rohn's quote treated on the blogosphere. For one thing, I think this advice is likely to fall flat once you close your browser and try to implement it. And as far as I can tell organically developed friendships fare a lot better than deliberately created ones. Not to mention that focusing on your present situation seems quite likely to address symptoms rather than root causes of whatever you feel is wrong with your life. My advice would rather be: don't edit your friends, but set yourself up to make the friends you need[2].

As an example: When I was a teenager I decided that I wanted to go to MIT, based simply on the idea that it was the best university in the world for sciency stuff. When it came time to apply I gave the matter more serious thought: I concluded that it was not worth paying a lot of money for the higher quality education MIT would provide, since I could indubitably improve my education free of charge by increasing my own effort. Although this reasoning still rings true, I now feel that it completely misses the point of going to MIT: the added value is not (primarily) what the professors are able to teach you, but who the other applicants are. Anybody that gets into MIT is smart, and ambitious enough to devote time to jumping through the hoops the application involves. These are the people you’ll be spending 4+ years of intellectual maturation and youthful initiative with, and some will likely become lifelong friends. So if the personality osmosis described above is real, pre-selecting for brains and ambition might well make a big difference in what kind of person you end up being. Is that worth a hundred thousand dollars? Maybe.

[1] I imagine that those who are close to me might say that I’m incapable of developing ideas without speaking aloud, and I wouldn’t argue with them. Although perhaps this blog will make their burden somewhat lighter.

[2] Another way to influence your influences would be by spending a lot of time reading; reading a book (and thinking about it) is not entirely unlike a conversation with the author, and you have some seriously smart partners to choose from here. The catch is that you need to spend serious amounts of time reading, which lacks certain rewards for a social animal, and that you need to work a bit harder at establishing a real conversation.

Source Criticism – Part 1

Did you notice?

Genius is a will-o’-the-wisp if it lacks a solid foundation of perseverance and fanatical tenacity. This is the most important thing in all of human life.

Thomas Edison

I would like to ask you to take, say, 30 seconds to think about the above statement. Compare it to your own experiences and convictions, then score it on a 10-point scale according to how strongly you agree with it.


Edison, the quintessential American inventor, is often held up as a paragon of industriousness and perseverance. He supposedly tried about a thousand different light bulb designs before finding one that worked, and allegedly slept only four hours per night. He did not, however, make the above statement. Adolf Hitler did. Try reading it again, imagining a pre-war Hitler as the speaker. Do you think you’d have given the same score if you’d had this visual in mind the first time?

Now, it's not really surprising that we're swayed by appearances; the halo effect is a well-documented phenomenon. What's interesting is whether this mental shift in fact happens before our brains start consciously processing information, creating a bias that the brain then actively rationalizes away. That's how it feels when I think about my own experiences of this sort, and the implication would be that when making such biased choices, we haven't the faintest notion that we're doing so. As we all know from the communication breakdown of a quarrel, irrationality is a lot worse when we think we're being rational.

This isn't really (a lack of) source criticism in the traditional sense, but it is in the sense of basing decisions on "knowledge" without asking where it came from. But so what? Does it actually affect our lives if we occasionally do things without paying attention to what prompted the action?[1] When we're doing something where source criticism matters, like searching for information on the internet, we're all aware of it. But then again this discussion isn't about things we realize, but the things we do without even noticing it…

Try thinking of the last article you read online. Doesn't matter whether it was about avoiding stress, the value of college educations, financial developments or whatever. Stop for a second, and try remembering as much as you can about it. All done with that? OK: who wrote it? Not "where did you read it?", but who actually wrote the article? If you're like me, you won't remember. You might argue that it doesn't matter who wrote it since the site is legitimate, and I'll address this in part 2. But for now let's just agree that you don't know or remember who wrote the article. Now, let me ask you this: do you ever, in casual conversation, say something like "I can't remember exactly where I read this, but…"? I certainly do. And what happens when we do this is that some information from the article is passed on to our friends, but with us replacing the unidentified website as its source.

So what's the big deal? Of course not everything we say in casual conversation should be taken as gospel. But we evaluate the statements we hear and accept them based on their merits, right? Right. But this is not about the things we do when we're paying attention, and attentive listening is not the only way that we 'learn' things. I would argue that there's a body of information, let's call it the collective consensus, that we regularly draw on largely without realizing it. Anything "they say…" falls into this category, as do things that you "just know". And the collective consensus does affect how we lead our lives. For example: in California, we know that it's best to limit your gluten intake, and gluten sensitivity is quite common. In consequence, there are a large number of gluten-free products available. In Berlin it is more difficult to find gluten-free scones, and (in my experience) quite rare to hear anyone talking about it. Whether this means that gluten isn't a problem or that the Germans aren't aware of it won't be discussed here, because the point is merely that the information that comprises the collective consensus does in practice shape (each) society.

How does information enter the collective consensus? I won't even try to give a full answer to that here, but certainly one way is through our casual conversations. Now, the problem isn't when overtly biased stuff tries to force its way into the collective consensus, since that is pretty effectively filtered out. Not the stuff we notice, in other words. But continuing from the example above, there are mechanisms by which information can get in without us noticing it. Here's an example of how this might happen: My friend's dad sends her a book on seafood depletion. She reads it but isn't quite convinced, so she keeps eating seafood and quickly forgets about it. But then, six weeks later she goes: "I read somewhere that there's 26 pounds of bycatch for every pound of shrimp, and it's all just thrown away…" Since I trust my friend, this factoid enters my subconscious as 'semi-true', and after that happens a couple of times the force of repetition means that I "just know" that every bowl of shrimp means two gallons of dead fish[2].

Notice that in the first step my friend doesn't read the book because of its credentials, but because her father suggested that she do so. When she tells me, she doesn't remember the source, and thus assumes that position herself. And by the time it's entered my subconscious, I couldn't possibly say where it came from. So in each step the requirement and/or possibility for source criticism is removed[3]. Why is this problematic? The original information wasn't necessarily wrong, but the fact that it's possible for it to enter the collective consensus without critical evaluation, and without anyone noticing its infiltration, allows useless or destructive information to get in. An example of this causing trouble is the pursuit of alchemy throughout the 17th century; what might those natural philosophers have achieved if they hadn't been chasing the Philosopher's Stone?

So what do we do? There isn't an obvious solution, as far as I can tell. Since all this is about the things we don't notice happening, any active defense would be useless unless it worked before the infiltration had a chance to happen. One way to do this would be to compulsively demand a source for every piece of information. But that solution would pretty much preclude the use of the collective consensus, and make a lot of things go a lot slower. Quite possibly this wouldn't be worth it for you. But then again, if you happen to be doing something where you can't afford to be wrong, maybe it would.


[1] If nothing else, it certainly affects Israeli prisoners up for parole (And in the spirit of this post, here’s the original paper).

[2] A rather surprising finding from psychological research is that we will apparently believe things that we hear over and over, even when we know the source is not reliable. See here and here. The good news is that this only works when we aren’t really paying close attention. Say, in casual conversation…

[3] If you're a connoisseur of pickpocketry, you may notice the parallels to a ploy used by groups of thieves: the first lifts the wallet out of your pocket and lets it fall to the ground, another picks up the wallet, and passes it to a third who's walking by. Catching the first in the split second where he has your wallet is unlikely, and the second isn't doing anything illegal by picking it up. Nobody's kept track of it by the time the third person has it, and so the whole group escapes accountability.

Overlooked wisdom

When (and where) my parents were young, the concept of money was pretty uncool. Changing the world was cool, doing interesting things was cool, and (most of all, I suspect) being cool was cool. So that’s what they internalized, and of course it rubbed off. Although I somehow ended up quite frugal, making money never ranked high amongst my goals. And since all I knew about Warren Buffett was that he was one of the world’s richest men, I never paid any attention to him (or Charlie Munger).

Munger & Buffett

From experience, I would say that this kind of dismissal is not too uncommon. Maybe you want to know about success, but since you care more about your art than about money you instead go read what Baryshnikov has to say on the matter. Or maybe in your artistic circles the name Buffett just never came up (at least in that context).

But this post is not about Warren Buffett being overlooked. Rather, it’s the idea that there’s a lot of overlooked wisdom in the world, a lot of insight that goes unnoticed by most people because it’s normally associated with a specific area of interest. But the thing is that the wisdom can often be useful outside of that area. The understanding that made Buffett and Munger billionaires through investing can easily find application in other parts of life. Here is a quote from each of them, by way of illustration.


It’s not given to human beings to have such talent that they can just know everything about everything all the time. But it is given to human beings who work hard at it — who look and sift the world for a mispriced bet — that they can occasionally find one. And the wise ones bet heavily when the world offers them that opportunity. They bet big when they have the odds. And the rest of the time, they don’t. It’s just that simple.


No matter how great the talent or efforts, some things just take time. You can’t produce a baby in one month by getting nine women pregnant.


If we looked at the people who are most successful in various fields, we might find that they have more in common with each other than with the average person in their field. So is it really so incredible that we could learn from the leader of a drastically different field? To reverse the artist example above, an avid investor might well have gorged himself on anything written by or on Warren Buffett. But after he has exhausted the shrewdest works of his field, wouldn't you expect him to receive diminishing returns? At what point would he gain more (even strictly in terms of investing) from reading Seneca or science fiction than from another second-hand analysis of Warren Buffett?

Another recent example of this came when I picked up an autobiography by UFC champion Georges St. Pierre. To be honest I wasn't expecting much: some ghostwritten recap of his awesomeness, designed to extract dollars from existing fans. But at the time I was quite absorbed in mixed martial arts, and grabbed it on a whim.

Georges St. Pierre

Which is fortunate, because it turned out to be a very insightful description of the story behind success. Of the mentality required not just to undergo ridiculous training regimens, but to do so with complete focus and vision to ensure constant growth. Of psychological pitfalls, and why cockroaches are more impressive than a Tyrannosaurus Rex. And, most notably, of the sacrifices that any wholehearted pursuit of greatness entails.

Some biographies make it seem that champions (in whatever field) lead an essentially normal life but just happen to be magically better than everyone else. Which, to me, is total bullshit. You can be good at many things, but if you want to be better than everyone else in the world you have to step beyond the normal. And at that point something has to give, be that personal relationships or other interests. I had never before read so forthright an account of this, and although the book’s other lessons were not quite as edifying, I did feel that almost all of them could be directly applied to a career in science. A few example quotes:

On innovation, the most important trait for a would-be champion:

Very often, we see leaders lose sight of how they got to where they are: by being and thinking differently from the competition. They make it to first place, and then their thinking changes from seeking innovation to seeking the status quo. They think, I made it to first place, so now I must not change a thing. But change is what got them to the top in the first place! This is because they’re focused on the positive result rather than on the process of success.

On discipline, and self-honesty:

The real test is this one: When you’re alone in a room, when you’re in a private place and nobody else can see you, what do you choose to do? Eat well, or eat poorly? Exercise, or watch television?

On (necessary) sacrifices:

There is no such thing as a normal friendship in my life … I look at the people who are close to me, the ones I refer to as friends, and I wonder: Will I ever have a relationship like his? Will I ever achieve marriage, children, family? Will I ever own a barbecue or have dishes in my cupboards or live life according to the rules that govern masses of individuals?..

If any of this rings true, I heartily recommend reading The Way of the Fight.


We all feel entitled to certain things: Nobody would pay to breathe. We generally feel that we can do whatever we want in the privacy of our home, and sometimes when we’re on vacation. Depending on where and how we grow up, we gradually add new things to the list. Some would say that we’re all entitled to a job, or even to medical care. And there’s definitely a few people who, whether they admit it or not, feel entitled to a life of beaches, cocktails and sleeping in. If nothing else, in many parts of the world we are constitutionally entitled to life, liberty and the pursuit of happiness.

But are we entitled to hold an opinion?

Most of us would say yes. Charlie Munger would say no, although we can earn the right:

“I never allow myself to have an opinion on anything that I don’t know the other side’s argument better than they do.”

Munger has an impressive intellect, which is probably why he’d venture to make a statement with such strong implications. But it also means that it’s worth taking a minute to consider what he is saying. Is it possible that holding opinions indiscriminately is bad for you?

A discussion of what we’re entitled to could become rather philosophical, but let’s examine what practical implications it can have to hold an opinion on something. In other words, what do our opinions do for us?

To my eyes the main effect seems to be simplifying cognitive processing: If you’ve already filed electronic dance music in your brain’s “bad” folder, you can very quickly deal with anything that falls into that category. If you know that you prefer fish over beef, you save a lot of time in restaurants and supermarkets. So your suite of opinions acts as a paradigm for going through life, giving you a set of assumptions that save time and mental effort.

How does it work out in practice? I don't smoke because I consider it expensive and dangerous. I don't know if this opinion would quite stand up to Munger's criterion. But if someone told me that $7 for a pack of cigarettes isn't a big deal, I could rely on my understanding of opportunity cost and compound interest to realize that by not smoking I could buy not one but two rather nice cars over a ten-year period, or a rather nice house if I wait 40 years[1]. And if that person told me that their uncle smoked for 40 years and didn't get cancer, I would know that anecdotal evidence is meaningless for events that are at least partially random, and that all we can effectively do in life is skew the averages. So I think it works out in this case.

Another example: with no limit on holding opinions, I would probably have said that if you are intelligent you will end up being successful in some field or other. Why do I think so? For one thing I admire intelligent people. And society in general often equates success with intelligence. But if I'm being honest this opinion is not very well-founded at all. I've never looked at the proportions of intelligent vs. random people being successful in a large dataset, and it's clear that there are many successful people who are not the most intelligent (even within scientific research). I'm not saying that intelligence doesn't correlate with success, but if I'm not sure about it, what do I gain from holding the opinion? I can't think of anything offhand, whereas living by this opinion would risk ignoring other factors that contribute to success. Going back to Munger, I would probably be better off by not allowing myself to hold this opinion.

So opinions can help speed up our cognitive processing, but when they aren’t based on serious thinking and research they can lead us to the wrong conclusions. Worse, because of confirmation bias we will invariably pay more attention to things that seem consistent with our opinions (even when the opinion was originally based on nothing) and can thus end up quite irrational. Whereas not holding any opinion that we didn’t develop scrupulously will prompt us to analyze things with a bit less bias.

Why do we love our opinions so much? For one thing, our brains appear to be unexpectedly fond of minimizing effort (for a fascinating treatise on this I heartily recommend Thinking, Fast and Slow by Daniel Kahneman). But perhaps another reason is that opinions help define our sense of self. When we demarcate our likes and dislikes we are effectively constructing our self-image, whether we intend to or not. This may explain why, in addition to holding them, we feel compelled to express our opinions whenever they appear (even remotely) relevant. To repeat our previous analysis, what does this do for us? Is it beneficial to establish and reinforce your self-image? I wouldn't venture to say, but it does seem to directly stimulate the pleasure centers of the brain. On the other hand, I'm probably not the only one who has offered an opinion during conversation and quickly realized that what I said was for my own benefit rather than a real attempt to understand and relate to the other person. With a bit of misfortune this turns the conversation into 'parallel monologues', and frankly I would be reproachable for making that happen. And though we may not be doing ourselves any immediate disservice when we thrust our opinions out onto the world wide web, we are still subjecting the world to them. Though anyone is free to ignore them, we seem to have a natural inclination to trust what we read. Even Marcus Aurelius had to remind himself that "everything we hear is an opinion, not a fact". If what we merely express as opinion can become rooted as fact, perhaps it behooves us to give our opinions greater scrutiny?

This is not meant to admonish anyone. After all, I have little legitimate basis on which to judge the opinions of people I do not know. Rather I would present a hypothesis that changed my thinking a bit, and which I find to have some value. At the same time, this ought to set a standard that I wish to hold each of my posts to; if at any point I post something that hasn’t been considered carefully, I sincerely hope that someone will be able to call me out on it.

Thanks to Shane Parrish at Farnam Street (and by extension to Charlie Munger) for reminding me to speak only after much thought, and extending this concept even to holding an opinion.

[1] The average smoker in California goes through a pack per day, so 7 packs/week ≈ $49, or $25,480 over ten years, with an additional opportunity cost of $13,510 compared to investing the same amount at the historical S&P 500 rate of return of 8%. Total gains after 40 years would be about $750,000.
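For anyone who wants to check the footnote's arithmetic, here is a minimal sketch. It assumes the $49/week is invested as it accrues, with the 8% annual rate approximated as weekly compounding; the function name and parameters are mine, chosen for illustration.

```python
def cigarette_opportunity_cost(weekly_spend=49.0, annual_rate=0.08, years=10):
    """Compare cash spent on cigarettes with investing it weekly instead.

    Returns (total_spent, future_value_if_invested). Uses the standard
    future value of an ordinary annuity: FV = P * ((1 + r)^n - 1) / r,
    where P is the weekly contribution and r the weekly rate.
    """
    r = annual_rate / 52          # approximate weekly rate
    n = years * 52                # number of weekly contributions
    total_spent = weekly_spend * n
    future_value = weekly_spend * ((1 + r) ** n - 1) / r
    return total_spent, future_value

spent, fv = cigarette_opportunity_cost(years=10)
# spent = $25,480; fv - spent ≈ $13,500 of forgone gains, as in the footnote
spent40, fv40 = cigarette_opportunity_cost(years=40)
# fv40 ≈ $750,000 after 40 years
```

The exact numbers shift slightly with the compounding convention (weekly vs. annual), but land within a few hundred dollars of the figures quoted above.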
