Saturday, March 4, 2023

How We Find and Accept the "Truth"

I thought this piece from J. Sanilac, from his Substack Trust Networks: How We Actually Know Things, was superb on the topic. I particularly dug his list of twenty "techniques that can help us to navigate the chaotic sea of character and find the truth."

  1. Stop pretending ad hominem judgments are irrational and avoidable. Instead, accept that they're necessary and use them consciously and intelligently. As I explained in Part I, even the smartest and best-informed person needs to rely on ad hominem judgments for much of his knowledge. If you pretend you can know everything directly, or even that you can suspend judgment for any question you haven't answered directly, you'll only sink yourself deeper into delusion, and your beliefs will be less rather than more accurate.
  2. Don't delude yourself into assuming your own claims will be evaluated entirely on their merits. Instead, accept that you too will be the object of ad hominem judgments, and these judgments will significantly impact the reception of your claims. To be an effective knowledge disseminator you should signal trustworthiness with your track record, your outward presentation, your credentials, your character, and your alliances. Even if he has a brilliant idea, a man with no credentials, no connections, shabby clothes, and poor interpersonal skills will lose far more time lobbying to get that brilliant idea taken seriously than he would have spent if he'd polished his signaling first. Businessmen and politicians understand this intuitively. Unfortunately, the kinds of men who invent important new ideas usually do not, perhaps because it's precisely indifference to signaling and partisanship that enables them to discover what other people overlook.
  3. Use the black box method to test claims whose details you can't understand or evaluate directly. Just check the final outputs: if they're good, the information within the box is probably good too; if they're bad, the information within the box is flawed at best, and likely wrong.
  4. Evaluate your sources' record. Do they have a history of making accurate predictions or producing good practical results? Those who've been right in the past are likely to be right again. And you too should try to build a track record of reliability, because in the course of time it will add weight to your claims. However, there's an important caveat, which I'll explain below.
  5. Distrust prestige transfer. Some public figures try to leverage their record of success in one domain to create the impression of reliability in another, unrelated domain. For instance, someone who's distinguished himself as a linguist or chess player might try to transfer the prestige he's accrued in his original field to a new field, like political theory, where it hasn't been honestly earned. But success in different domains requires a different cognitive style, different strengths, a different knowledge base, and years of different experience. It's not uncommon for successful CEOs to have naive views about topics that don't directly impact their business operations. You can't assume the composers of masses are experts in theology, nor vice versa. So prestige transfer should awaken your distrust.
  6. Evaluate your sources' incentives and disincentives. Sources who are incentivized to tell the truth are inherently more reliable. Yet when it comes to topics of public relevance, they rarely exist, and the best one can hope for is to find sources who aren't too strongly incentivized to lie. Of course, a few rare people do compulsively seek and tell the truth due to an innate altruistic instinct, but they usually lack the motivation to explore complex topics in depth, and are therefore less useful than one would wish. (As previously mentioned, betting markets are an attempt at solving this problem.)
  7. Evaluate your sources' cognitive styles and use this information to interpret their claims. Some people tend to be paranoid and overstate possible negative outcomes, others are undisciplined and jump on new ideas without thoroughly examining them, others are especially prone to partisanship, others are stubborn and never back down when they're wrong, etc. By identifying the character of a speaker you can estimate the risk he'll make a habitual error or exaggeration, and use this to translate his claims into a more accurate form. For instance, a paranoid thinker can be expected to overestimate risk, treating low probability futures as if they're matters of pressing concern, so when absorbing his warnings (e.g. about asteroids) you should downgrade their urgency.
  8. Build a stable of trusted sources. Because past record is a good indicator of present reliability, it's important to observe your sources over a period of years. This is a time-consuming process by nature, so when you discover a reliable source you should consider him a valuable long-term acquisition.
  9. Lower your confidence in the most popular sources. Sources and claims that are afflicted by higher-than-average epistemic load are amplified, especially in the social media ecosystem. Because of this the most prominent people are not the most reliable people. You should interpret great popularity among the general public as a negative sign with respect to trustworthiness.
  10. Give obscure outsiders a chance. If you always follow the obvious signals of trustworthiness, like credentials, respectable presentation, and uniformly palatable opinions, you'll sometimes trap yourself in a cul-de-sac of mutually reinforcing conformists who've shut out dissenters. Once in a while you should make a foray into the wilderness, because obscure and disagreeable outsiders who are ridiculed, denounced, ostracized, and shamed by the mainstream occasionally are right when everyone respectable is wrong. Usually, of course, they're a waste of your time.
  11. Estimate the effect of signaling load and attempt to correct for it. Comb through all your beliefs to determine which function as positive social signals, and lower your confidence in these beliefs. You should lower your confidence even more if you've fallen into the habit of using them as signals yourself. Of course, they might be true; but the expected pattern is for them to be exaggerated in the direction of optimal signaling, and they could even be empty fabrications. You should also raise your confidence in beliefs that send negative signals. It's likely that some of these are correct, but socially unpalatable, and therefore unfairly denounced. Of course, it goes without saying that you should try to avoid overcorrecting. (Note that a dissident subculture isn't immune to signaling load, but rather develops its own local signals that aren't functionally different from those of society at large. Thus, being a dissident, or contrarian, or minority does not in any way exempt you from the need to correct for signaling load.)
  12. Estimate the effect of partisan load and attempt to correct for it. Most people already assume the truth falls somewhere between the extreme statements of opposing factions, so it might seem that correcting for partisan load is as simple as embracing moderation and aiming toward the center. However, this type of lazy centrism isn't actually a good way to find the truth. Political actors are experts at manipulating it. For instance, as we discussed earlier, they can use propaganda to portray their favored views as normal and centrist even if they're partisan minority views in reality. They can also encourage their extremists to be more extreme in order to move the perceived center closer to their side. (E.g. Trust Network A says the answer is 1, Trust Network B says the answer is -1, a lazy centrist concludes the answer is 0. Whence political operators in Trust Network A can use a common sales tactic to get their way: by overshooting and claiming the answer is 3, they cause lazy centrists to conclude that it's 1, their original desideratum. Because they're vulnerable to this tactic, lazy centrists can actually encourage extremism!) There are, furthermore, plenty of historically verifiable cases where one side turned out to be wholly correct and the other wholly wrong, so that centrism would not have arrived at the truth. Thus, when you try to correct for partisan load, you shouldn't just take a moderate position between two sides and stop there. It's better to analyze the effects of partisan load carefully first.
  13. Don't assume that partisanship as such is bad. Partisan load does degrade the accuracy of our beliefs, but that doesn't necessarily mean you should reject partisanship. The reason partisanship isn't wholly bad, and indeed the reason it's a natural instinct in the first place, is that it's entirely possible—even likely—that an enemy is really your enemy. In other words, one trust network may be a real antagonist whose members really wish to deceive you and do you harm because they have interests that are contrary to yours. Humans are individuals who exhibit tribal coherence. If you insist on being naive and judge everyone only as an individual, you and your allies risk defeat, and in the worst case, even annihilation. Someone who encourages you to ignore partisanship when a genuine conflict is underway is not your friend, but your enemy, or at best a fool. Before rejecting partisanship you should evaluate the whole landscape in detail and choose a side if need be.
  14. Hide or camouflage unpopular views and signals of partisan alignment when trying to communicate to moderates, opponents, and general audiences. If you send the wrong signals or create the wrong associations you'll trigger an immediate rejection of your claims, no matter how good or true they are, because you'll be identified as an enemy and therefore dismissed. One solution to this is to focus narrowly on your issue of interest and avoid addressing other topics entirely. This prevents any controversial or partisan-aligned views you may hold from becoming a divisive distraction and reducing your impact. Another tactic is to advocate for positions that are more moderate than your actual beliefs, pushing for a direction and then pushing again rather than selling your ultimate target up front. Both of these approaches are in common use.
  15. Use your instincts. We have fine intuitions for making ad hominem judgments in context, and the rational judgments we make in the abstract are quite myopic in comparison. Good instincts are a serious asset, so if you have them you should value them. This is not, of course, to say that they can never be wrong.
  16. Use sensory information. Factual information is conveyed most efficiently in text form. However, information about the human subjects who transfer this factual information is conveyed most efficiently in audiovisual form, and some of the information that can be found in appearance and voice is completely absent from text.
  17. Look out for hackers. Look for signs that someone is intentionally manipulating ad hominem signals to induce trust or distrust where they aren't merited. Unfortunately it's not always possible to identify bad actors before they've done harm.
  18. Be forgiving of humans who are in the grip of bad ideas. I'm not so keen on this one myself, dear readers, but I feel at least obliged to mention it in order to signal care. All of us have some wrong and indeed outright stupid ideas we can't recognize as such. This isn't necessarily because we're stupid ourselves, although often that is indeed the case. Rather it's because ad hominem judgments, while unavoidable, are an imperfect source of knowledge, and they can't be relied on to filter out every bad idea percolating through our trust networks. We ought to be forgiving of others who are also in the grip of foolish ideas thus acquired, especially when they're young and inexperienced.
  19. Avoid overconfidence. It feels good to be confident in the beliefs of your trust network. But for the reasons just mentioned, it's inevitable that this confidence will sometimes be misplaced. If you want to form a probabilistically accurate picture of the world you should abstain from the joy of overconfidence, and always remain open to the possibility that some of your beliefs are false. In fact, it's safe to assume that some of your beliefs are false.
  20. Read fiction. As a writer, of course I would tell you to read fiction. So obviously you shouldn't trust me. But the reason I've littered this essay with so many examples is that, outside of real-world experience, narratives are the best means we have for thinking about and understanding ad hominem judgments and trust networks. Trust is the stuff novels are made of. Even cheap soap operas often take trustworthiness and trust networks as their main topic, with the drama unfolding around questions like: who's conspiring with whom, who's really on whose side, who's lying and who's telling the truth? If you keep your nose buried in numbers and make the mistake of dismissing everything else as wordy nonsense, you might end up trusting the wrong people and pay the price for it.
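The overshoot arithmetic in item 12 is simple enough to sketch in a few lines of Python. This is my own toy model, not anything from the essay: a lazy centrist just averages the two factions' stated positions, so an overshooting faction can drag the perceived midpoint toward its real goal.

```python
def lazy_centrist(position_a: float, position_b: float) -> float:
    """Split the difference between two factions' stated positions."""
    return (position_a + position_b) / 2

# Honest statements: A says 1, B says -1, so the centrist lands on 0.
assert lazy_centrist(1, -1) == 0

# A overshoots to 3 while B stays at -1: the centrist now lands on 1,
# which was A's original desideratum all along.
assert lazy_centrist(3, -1) == 1
```

The point of the sketch is that the centrist's conclusion is a function of what factions *say*, not what they believe, which is exactly the lever the overshoot tactic pulls.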