I'm currently working as a Research Scholar at the Future of Humanity Institute. I've previously co-created the application Guesstimate. Opinions are typically my own.
Ah, good to know. Do you have recommendations for other words?
Yea, I also found the claim, as well as a few results from old books before the claim. The name comes straight from the Greek, though, so it isn't that original or surprising. Just because it hasn't been used much before doesn't mean we can't start to use it and adjust the definition as we see fit. I want to see more and better terminology in this area.
The term for the "fear of truth" is alethophobia. I'm not familiar with many other great terms in this area (curious to hear suggestions).
Apparently "Epistemophobia" is a thing, but that seems quite different; Epistemophobia is more the fear of learning, rather than the fear of facing the truth.
One given definition of alethophobia is "the inability to accept unflattering facts about your nation, religion, culture, ethnic group, or yourself."
This seems like an incredibly common issue, one that has been discussed especially often recently, but without much specific terminology.
I keep seeing posts about all the terrible stories in the news recently. 2020 is a pretty bad year so far.
But the news I've seen people posting typically leaves out most of what's been going on recently in India, Pakistan, much of the Middle East, most of Africa, most of South America, and many, many other places as well.
The world is far more complicated than any of us have time to adequately comprehend. One of our greatest challenges is to find ways to handle all this complexity.
The simple solution is to try to spend more time reading the usual news: if the daily news becomes three times as intense, spend three times as much time reading it. But this is not a scalable strategy.
I'd hope that over time more attention is spent on big picture aggregations, indices, statistics, and quantitative comparisons.
This could mean paying less attention to day-to-day events and individual cases.
Thanks! I'll check it out.
I was recently pointed to the Youtube channel Psychology in Seattle. I think it's one of my favorites in a while.
I'm personally more interested in workspace psychology than relationship psychology, but my impression is that they share a lot of similarities.
Emotional intelligence gets a bit of a bad rap due to its fuzzy nature, but I'm convinced it's one of the top few things for most people to get better at. I know lots of great researchers and engineers who keep falling into the same failure modes, causing severe organizational and personal problems as a result.
Emotional intelligence books and training typically seem quite poor to me. The alternative format here of "let's just show you dozens of hours of people interacting with each other, and point out all the fixes they could make" seems much better than most books or lectures I've seen.
This Youtube series does an interesting job at that. There's a whole bunch of "let's watch this reality TV show, then give our take on it." I'd be pretty excited about there being more things like this posted online, especially in other contexts.
Related, I think the potential of reality TV is fairly underrated in intellectual circles, but that's a different story.
https://www.youtube.com/user/PsychologyInSeattle
Fair point. I imagine when we are planning for where to aim things though, we can expect to get better at quantifying these things (over the next few hundred years), and also aim for strategies that would broadly work without assuming precarious externalities.
There's a fair bit of discussion of how much of journalism died with local newspapers, and separately of how the proliferation of news beyond 3 channels has been harmful for discourse.
In both of these cases, the argument seems to be that a particular type of business transaction resulted in tremendous positive national externalities.
It seems to me very precarious to expect society at large to work only because of a handful of accidental and temporary externalities.
In the longer term, I'm more optimistic about setups where people pay for the ultimate value, instead of this being an externality. For instance, instead of buying newspapers, which helps in small part to pay for good journalism, people donate to nonprofits that directly optimize the government reform process.
If you think about it, the whole process is really inefficient and roundabout compared to what's possible. There's very little division of expertise among the public, for instance, and there's no coordination where readers realize that there are 20 things that deserve equal attention and split into 20 subgroups accordingly. This is very real work the readers aren't getting compensated for, so they'll do whatever they personally care the most about at the moment.
Basically, my impression is that the US is set up so that a well-functioning 4th estate is crucial to making sure things don't spiral out of control. But this places great demands on the 4th estate that few people are now willing to pay for. Historically this worked via positive externalities, but that's a sketchy place to be. If we develop better methods of coordination in the future, I think it's possible to coordinate to just pay the fees and solve the problem.
For those reading, the main thing I'm optimizing Foretold for right now, is for forecasting experiments and projects with 2-100 forecasters. The spirit of making "quick and dirty" questions for personal use conflicts a bit with that of making "well thought out and clear" questions for group use. The latter are messy to change, because it would confuse everyone involved.
Note that Foretold does support full probability distributions with Guesstimate-like syntax, which PredictionBook doesn't. But it's less focused on the quick individual use case in general.
If there are recommendations for simple ways to make it better for individuals, maybe other workflows, I'd be up for adding some support or integrations.
[retracted: I read the question too quickly, misunderstood it]
My impression, after some thought and discussion (over the last ~1 year or so), is that people being smarter / predicting better will probably decrease the number of wars and make them less terrible. That said, there are of course tails; perhaps some specific wars could be far worse (one country being much better at destroying another).
As I understand it, many wars started in part due to overconfidence: both sides were overconfident about their odds of success (for many reasons). If they were properly calibrated, they would be more likely to make immediate trades/concessions or similar, rather than take on fights, which are rather risky.
Similarly, I wouldn't expect different AGIs to physically fight each other often at all.
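The overconfidence point above can be sketched as a toy payoff calculation. This is a minimal illustration with made-up numbers (the prize value, war cost, and probabilities are all my own assumptions, not from any source): when both sides overestimate their odds of winning, fighting can look at least as good to each of them as any split of the prize, while calibrated beliefs leave room for a bargain that both prefer.

```python
# Toy bargaining model of war and overconfidence.
# All numbers here are arbitrary, for illustration only.
PRIZE = 100     # value of the contested resource
WAR_COST = 20   # cost each side pays if they fight

def expected_value_of_war(p_win):
    """Subjective expected payoff of fighting, given a believed win probability."""
    return p_win * PRIZE - WAR_COST

# Overconfident: each side believes it wins 75% of the time
# (their probabilities sum to 1.5, so at least one must be wrong).
ev_overconfident = expected_value_of_war(0.75)   # 55

# Calibrated: the probabilities sum to 1, say an even 50/50.
ev_calibrated = expected_value_of_war(0.5)       # 30

# A negotiated even split gives each side 50 without paying WAR_COST.
split = PRIZE / 2

print(ev_overconfident > split)  # True: to each overconfident side, war beats the deal
print(ev_calibrated < split)     # True: calibrated sides both prefer the deal
```

The design point is just Fearon-style bargaining logic: fighting destroys `WAR_COST` per side, so calibrated parties can always find a split that beats war in expectation; inflated win probabilities erase that bargaining range.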