Viliam

Comments

Covid 8/6: The Case of the Missing Data
The second possible explanation is that this is intentional variolation. The goal is to infect the whole team now so they’ll be ready for the season.
Think about it. It’s much less crazy than it sounds.

The crazy part will be watching people who can't breathe trying to play football.

Mati_Roy's Shortform

Aren't there already too many people in prisons? Do we also need to put people there who otherwise wouldn't have committed any crime?

I guess this depends a lot on your model of crime. If your model is something like -- "some people are inherently criminal, but most are inherently non-criminal; the latter would never commit a crime, and the former will use every opportunity that seems profitable to them" -- then the honeypot strategy makes sense. You find and eliminate the inherently criminal people before they get an opportunity to actually hurt someone.

My model is that most people could be maneuvered into committing a crime, if someone spent the energy to understand them and create the proper temptation. Especially when we consider the vast range of things that are considered crimes (it does not have to be murder; it could be smoking weed, or even things you had no idea were illegal), I'd put the potential success rate at 99%. But even if we limit ourselves to the motte of crime, say theft and fraud, I'd still say more than 50% of people could be tempted, if someone spent enough resources on it. Of course some people are easier to nudge than others, but we are all on the spectrum.

Emotionally speaking, "entrapment" feels to me like "it is too dangerous to fight the real criminals, let's get some innocent but stupid person into trouble instead, and it will look the same in our crime-fighting statistics".

Sunny's Shortform
By the way, it's just popped into my head that I might benefit from doing an adversarial collaboration with somebody about Occam's razor. I'm nowhere near ready to commit to anything, but just as an offhand question, does that sound like the sort of thing you might be interested in?

Definitely not interested. My understanding of these things is kinda intuitive (with intuition based on decent knowledge of math and computer science, but still), so it's a case of "I'll know it when I see it" (give me two options, and I can probably tell you whether one of them seems "simpler" than the other), but I wouldn't try to put it into exact words.

The New Scientific Method

I am fascinated by the amount of effort you put into writing a comment.

I understand the call of duty and sometimes spend 30 minutes writing and editing a short comment. But designing an algorithm to prove a concept, and writing an application... wow!

(Could you perhaps expand the comment into a short article? Like, remove all the quarreling, and just keep it as: "this is a simple algorithm that achieves something seemingly impossible"; perhaps also with some pseudocode. Because this is interesting per se, in my opinion, even for someone who hasn't read this thread.)

Medical Diagnostic Imaging, Leukemia, and Black Holes

Uhm. Not sure how to put this, but it seems like remaining silent would be the worse option...

Looking at your other articles, it seems to me like you came here with many strong opinions you want to share. You keep posting long monologues, and you get some response to them, but I suspect that there are more articles already written and waiting to be published, regardless of the feedback you might get here.

Which wouldn't be a problem if your articles were well received (i.e. upvoted) by the community... but they are not. Barely anyone votes on them, and the average feedback is negative.

I wish I could provide more detailed feedback, but the fact is that these articles are just too long for me. And even length is not exactly the problem; it's more like... for example, this article is about medical diagnostic imaging, a topic I know almost nothing about. Your arguments are not easy for me to verify; maybe you are right about something important, or maybe you are completely wrong... I can't tell. Generally, if I can't tell, I don't vote. But I suspect that most readers here are in the same position. And if so, this is not the right audience for you, and this is not the right content for this website.

If you want actual engagement with your ideas, I suggest taking it much slower. Don't put a dozen claims in a single article, because almost no one will read them all. (Ironically, you mention "gish gallop" in the same article that contains links to proteins, group delusion, lithium, gravitational waves, leukemia diagnostics, sunscreen, Triclosan, spine surgery, Gadolinium, boron, fluoride, heart attack, neutrinos, the Higgs particle, satellites, balloons, lasers, fusion, climate, planets, and the expansion of the universe.) If you focus on one thing, there is a chance you may find people who understand it and can have a debate with you. Otherwise, you are just wasting time here.

Sherrinford's Shortform

In theory, Wikipedia strives to be impartial. In practice, the rules are always only as good as the judges who uphold them. (All legal systems involve some degree of human judgment somewhere in the loop, because it is impossible to write a set of rules that covers everything and doesn't allow some clever abuse. That's why we talk about the letter and the spirit of the law.)

How do you become a Wikipedia admin? You need to spend a lot of time editing Wikipedia in a way other admins consider helpful, and you need to be interested in getting the role. (There are probably a few more technical details I forgot.) The good thing is that by doing a lot of useful work you send a costly signal that you care about Wikipedia. The bad thing is that if a certain political opinion becomes dominant among the existing admins, there is no mechanism to fix this bias; it's actually the other way round, because edits disagreeing with the consensus will be judged as harmful, and will probably disqualify their author from becoming an admin in the future.

I don't assume bad faith from most Wikipedia editors. Being wrong about something feels the same from inside as being right; and if other people agree with you, that is usually a good sign. But if you have a few bad actors who can play it smart, who can pretend that their personal grudges are how they actually see the world... considering that other admins already see them as part of the same team, and the shared political bias means they already roughly agree on who the good guys and the bad guys are... it is not difficult to defend their decisions in front of a jury of their peers. An outsider has no chance in this fight, because the insider is fluent in the local lingo. Whatever they want to argue, they can find a wiki-rule pointing in that direction; of course it would be just as easy for them to find a wiki-rule pointing in the opposite direction. (For example: if you want to edit an article about something you are personally involved with, you have a "conflict of interest", which is a bad thing; if I want to do the same thing, my personal involvement makes me a "subject-matter expert", which is a good thing. Your repeated editing of the article to make your point is "vandalism"; my repeated editing of the article to make the opposite point is "reverting vandalism".) And then the other admins will nod and say: "of course, if this is what the wiki-rules say, our job is to obey them".

The specific admin who is so obsessed with Less Wrong is David Gerard from RationalWiki. He has kept a grudge for almost a decade, since he added Less Wrong to his website as an example of pseudoscience, mostly because of the quantum physics sequence. After it was explained to him that "many worlds" is actually one of the mainstream interpretations among scientists, he failed to say oops, and continued in the spirit of: well, maybe I was technically wrong about the quantum thing, but still... and spent the last decade trying to find and document everything that is wrong with Less Wrong.

(Roko's Basilisk: a controversial comment that was posted on LW once, deleted by Eliezer along with the whole thread, then posted on RationalWiki as "this is what people at Less Wrong actually believe". Because the fact that it was deleted is somehow proof that deep inside we actually agree with it, but we don't want the world to know. Neoreaction: a small group of people who enjoyed debating their edgy beliefs on Less Wrong, were considered entertaining for a while, then became boring and were kicked out. Again, the fact that they were not kicked out sooner is evidence of something dark.)

Now look at who makes the most edits on the Wikipedia page about Less Wrong: it's David Gerard. If you go through the edit history and look at the individual changes, most of them are small and innocent, but they all push in the same direction: the basilisk and neoreaction must remain in the article, no matter how minuscule they are from the perspective of someone who actually reads Less Wrong; on the other hand, mentions of effective altruism must be kept as short as possible. All of this is technically true and defensible, but... I'd argue that the Less Wrong described by the Wikipedia article does not resemble the Less Wrong its readers know, and that we have David Gerard and his decade-long work to thank for that.

If the impression of lesswrong is distorted, then this may be a problem of what kinds of things on lesswrong are covered by media publications?

True, but most of the information in the media originates from RationalWiki, where it was written by David Gerard. A decade ago, RationalWiki used to rank quite high in Google, if I remember correctly; any journalist who did a simple background check would find it. Then he or she would ask about the juicy things in the interview, and regardless of the answer, the juicy things would be mentioned in the article. Which means that the next journalist would find them both at RationalWiki and in the previous article, which means that he or she would again devote a part of the interview to them, reinforcing the connection. It is hard to find an article about Less Wrong that does not mention Roko's Basilisk, despite the fact that it is discussed here rarely, and usually in the context of "guys, I have read about this thing called Roko's Basilisk in the media, and I can't find anything about it here, could you please explain to me what this is about?"

Part of this is the clickbait nature of the media: given the choice between debating neoreaction and debating the technical details of the latest decision theory, it doesn't matter which topic is more relevant to Less Wrong per se; they know that their audience doesn't care about the latter. And part of the problem with Wikipedia is that it is downstream of clickbait journalism. They try to use more serious sources, but sometimes there is simply no other source on the topic.

AllAmericanBreakfast's Shortform

Those opinions often have something in common -- respect for the scientific method, effort to improve one's rationality, concern about artificial intelligence -- and I like to believe it is not just a random idiosyncratic mix (a bunch of random things Eliezer likes), but different manifestations of the same underlying principle (use your intelligence to win, not to defeat yourself). However, not everyone is interested in all of this.

And I would definitely like to see "somebody friendly, funny, empathic, a good performer, neat and practiced" promoting these values in a YouTube channel or in books. But that requires a talent I don't have, so I can only wait until someone else with the necessary skills does it.

This reminded me of the YouTube channel of Julia Galef, but the latest videos there are 3 years old.

Food Spending During Covid
The house grocery budget won't count a restaurant meal, but it will count the food you eat at home because you didn't go out to eat.

I didn't run the numbers, but I believe that the total "house grocery budget + restaurant meals" is much lower these days for me. Like, instead of eating alone in a restaurant for 5 €, I can cook for my entire family for 2-3 €. The total savings are not that dramatic, though, because only the cooked meals are cheaper; things like cheese and chocolate cost the same as before.

It is interesting to look at the costs of individual food items. The things I eat just to not be hungry are mostly quite cheap (if I cook them at home). The things I eat only because they taste good actually make up most of the budget. Eating outside the home means everything tastes very good, and everything is expensive.

Looking from the opposite perspective: if instead of going to restaurants you start to eat at home, you may or may not save lots of money, depending on how much you cook.
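To make the "may or may not" concrete, here is a minimal back-of-the-envelope sketch in Python. All the numbers are illustrative assumptions loosely based on the figures above (5 € restaurant meals, 2-3 € home-cooked meals, a fixed "treats" budget), not actual measurements.

```python
DAYS = 30  # one month

# Fixed "treats" budget (cheese, chocolate, ...) -- assumed identical in both
# scenarios, since these items cost the same whether or not you cook.
treats = 40.00  # EUR per month (illustrative assumption)

# Scenario A: eating lunch out every day.
restaurant_meal = 5.00  # EUR per meal (figure from the comment above)
cost_eating_out = DAYS * restaurant_meal + treats

# Scenario B: cooking at home; one pot feeds the whole family.
home_cooked_meal = 2.50  # EUR per meal, midpoint of the 2-3 EUR figure above
cost_cooking = DAYS * home_cooked_meal + treats

print(f"Eating out: {cost_eating_out:.2f} EUR/month")  # 190.00
print(f"Cooking:    {cost_cooking:.2f} EUR/month")     # 115.00
print(f"Savings:    {cost_eating_out - cost_cooking:.2f} EUR/month")  # 75.00
```

The fixed "treats" term is what keeps the total savings less dramatic than the per-meal ratio suggests: the cooked meals cost half as much, but the overall budget drops by less than half.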

AllAmericanBreakfast's Shortform

Traditionally, things like this are socially achieved by using some form of "good cop, bad cop" strategy. You have someone who explains the concepts clearly and bluntly, regardless of whom it may offend (e.g. Eliezer Yudkowsky), and you have someone who presents the concepts nicely and inoffensively, reaching a wider audience (e.g. Scott Alexander), but ultimately they both use the same framework.

The inoffensiveness of Scott is of course relative, but I would say that people who get offended by him are really not the target audience for rationalist thought. Because, ultimately, saying "2+2=4" means offending people who believe that 2+2=5 and are really sensitive about it; so the only way to be non-offensive is to never say anything specific.

If a movement only has the "bad cops" and no "good cops", it will be perceived as a group of assholes. Which is not necessarily bad if the members are powerful; people want to join the winning side. But without actual power, it will not gain wide acceptance. Most people don't want to go into unnecessary conflicts.

On the other hand, a movement with "good cops" but no "bad cops" will get its message diluted. First, the diplomatic believers will dilute their message in order not to offend anyone. Then their fans will dilute the message further, because even the once-diluted version is too strong for the normies' taste. In the end, the message may gain popular support... kind of... because the version that gains the popular support will actually contain maybe 1% of the original message and 99% of what the normies already believed, peppered with the new keywords.

The more people present rationality using different methods, the better, because each of them will reach a different audience. So I completely approve of the approach you suggest... in addition to the existing ones.

Sherrinford's Shortform

My model is that in the USA most intelligent people are left-wing, especially when you define "left-wing" to mean the left half of the political spectrum, not just the extreme. And there seem to be many Americans on Less Wrong, just like on most English-speaking websites.

(Note that I am not discussing here why this is so. Maybe the left-wing is inherently correct. Or maybe the intelligent people are just more likely to attend universities where they get brainwashed by the establishment. I am not discussing the cause here, merely observing the outcome.)

So, I would expect Less Wrong to be mostly left-wing (in the 50% sense). My question is, why were you surprised by this outcome?

I don't see where left-wing lesswrongers are denounced as right-wing extremists.

For example, "neoreaction" is the only flavor of politics that is mentioned in the Wikipedia article about LessWrong. It does not claim that it is the predominant political belief, and it even says that Yudkowsky disagrees with them. Nonetheless, it is the only political opinion mentioned in connection with Less Wrong. (This is about making associations rather than making arguments.) So a reader who does not know how to read between the lines properly, might leave with an impression that LW is mostly right-wing. (Which is exactly the intended outcome, in my opinion.) And Wikipedia is not the only place where this game of associations is played.
