I recently got into an argument involving topics like whether water is wet and whether a hotdog is a sandwich. What these topics have in common is that they are not disputes over facts but disputes over definitions. This means there is nothing in observable reality that can be pointed to in order to resolve them. What’s worse, many arguments fall into this category by default.
The problem is that until all relevant definitions are agreed upon, an argument will often not be about facts. As long as even one definition isn’t agreed upon, the participants can make two different statements using the exact same words, which is terrible for communication. If “a hotdog is a sandwich” is interpreted by one person as “a hotdog is meat on bread” and by another as “a hotdog is meat between two pieces of bread”, they may each think the other is crazy. Even if they realize that the dispute is really over definitions, each may still believe it is unreasonable to use any definition other than their own.
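The two readings above can be sketched as competing predicates. This is only a toy illustration, and the predicate names and the food representation are mine:

```python
# Two competing definitions of "sandwich", applied to the same claim.
# A food is modeled as a dict; this representation is purely illustrative.

def is_sandwich_broad(food):
    """Definition A: meat on bread counts, any number of pieces."""
    return food["meat"] and food["bread_pieces"] >= 1

def is_sandwich_narrow(food):
    """Definition B: meat must sit between two separate pieces of bread."""
    return food["meat"] and food["bread_pieces"] >= 2

hotdog = {"meat": True, "bread_pieces": 1}  # one hinged bun

# The same sentence, "a hotdog is a sandwich", gets opposite verdicts:
print(is_sandwich_broad(hotdog))   # True under definition A
print(is_sandwich_narrow(hotdog))  # False under definition B
```

Both speakers utter the identical words, but because they evaluate them against different definitions, neither is making a claim the other is actually denying.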
Humans are generally thought of as the most intelligent beings in existence, as we have discovered nothing that appears to be more intelligent than us. But how intelligent are we compared to the theoretically attainable maximum? I believe that we are far less intelligent than is physically possible, because we possess roughly the minimum level of intelligence necessary to create a technological civilization.
To understand why this is so, we must look at how we came to have this level of intelligence in the first place. Over a period of about 50 million years, the brains of rat-like creatures evolved into the brains of chimps. This was a complex process that involved major structural changes. Going from chimp brains to human brains, by contrast, took only about 5 million years and required only minor structural changes, suggesting that intelligence was still increasing, and increasing faster, as civilization began, with no reason to expect the process to halt at exactly that moment. So if one were to ask why we stopped at our current level of intelligence, I would argue that we haven’t necessarily stopped at all. Rather, we are at this level because it was the first level that allowed us to begin building civilization, a process that unfolds on a much faster timescale than evolution.
One concept I find particularly fascinating is that of the singleton. As defined by Nick Bostrom, a singleton is “a world order in which there is a single decision-making agency at the highest level”. He states that such an agency would have “the ability to prevent any threats (internal or external) to its own existence and supremacy”. However, it is not at all obvious to me that this is true. To crystallize my intuition I would like to introduce my own concept: the universal inevitable threat.
An inevitable threat is defined as a threat that eliminates an agency regardless of the actions that agency takes. This leaves open the possibility of delaying the threat for a finite period of time, but beyond that, the agency may only choose how to spend the time it has prior to the threat’s realization. It can neither eliminate the threat nor hold it off indefinitely.
A universal inevitable threat is then defined as an inevitable threat that applies to all possible agencies. A notable property of such a threat is that it cannot be brought about by an agency, as this would imply that the agency had a course of action that would result in the threat not occurring and therefore that the threat was not inevitable relative to that agency. This concept is relevant because if a universal inevitable threat exists, then no agency is able to prevent all threats to itself, including a singleton. Furthermore, several plausible candidates for a universal inevitable threat can be identified.
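The deduction above can be put semi-formally. The notation here is my own, not Bostrom’s: write $C_A$ for the set of courses of action available to an agency $A$, and $T(A, c)$ for the proposition that the threat is realized when $A$ takes action $c$.

```latex
% A threat is inevitable for an agency A when it is realized
% under every course of action available to A:
\text{inevitable for } A \;\iff\; \forall c \in C_A,\; T(A, c)

% A universal inevitable threat is inevitable for every possible agency:
\forall A,\; \forall c \in C_A,\; T(A, c)

% No agency B can bring such a threat about: doing so would require
% an action c_1 with T(B, c_1) and an alternative c_2 (refraining)
% with \neg T(B, c_2), contradicting inevitability for B.
```

In other words, "being able to cause the threat" smuggles in "being able not to cause it", which is exactly what inevitability rules out.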
Colonizing Mars is often discussed in the context of preventing the extinction of humanity. This has led to the unfortunate misconception that colonization would mean abandoning Earth. I’d like to take the time to explain why that wouldn’t happen.
Before we explore just why Earth wouldn’t be abandoned, it will be useful to review the extinction-risk argument for colonizing another planet. The argument starts by noting that of all species ever to live on Earth, 99.9% have gone extinct. It then points out that most of these extinctions occurred in planet-wide events called mass extinctions. Establishing a species on a second planet would therefore be a good way to increase its chances of survival: if a cataclysmic event wipes the species off one planet, it is simply a matter of waiting for the dust to settle before reintroducing it from the other.
In my last post, I talked about why individuals’ incentives will prevent video game boycotts from succeeding. Now I’d like to discuss potential solutions (or at least vague outlines for potential solutions) to this problem.
The first idea that comes to mind is looking at historical successes. Two successful boycotts that I’m aware of are the Continental Association and the Montgomery bus boycott. For those who don’t know, the Continental Association was a boycott of British goods by the American colonies prior to the American Revolutionary War, and the Montgomery bus boycott was a protest against the segregated bus system in Montgomery, Alabama during the American Civil Rights Movement. In looking at these examples, I hope to identify both parts of a coordination mechanism: an agreement and an enforcement mechanism.
The Continental Association’s agreement is easy to find. The Association’s articles stated that colonists would refuse to import British goods and that American merchants would avoid price gouging to ease the resulting burden. Likewise, the articles provided several enforcement mechanisms. Those violating the agreement were to be publicly ostracized and condemned. Committees of inspection were set up to monitor businesses. Colonies were to cease all dealings with any colony that did not comply. Additionally, violence was employed on some occasions to force compliance. These mechanisms were successful enough that all but one colony complied up until the war began.