Sunny from QAD

Comments

Sunny's Shortform

Aha, no, the mind-reading part describes just one of the several cultures I'm mentioning (Guess Culture, to be exact). If I default to being an Asker but somebody else is a Guesser, I might have the following interaction with them:

Me: [looking at some cookies they just made] These look delicious! Would it be all right if I ate one?

Them: [obviously uncomfortable] Uhm... uh... I mean, I guess so...

Here, it's retroactively clear that, in their eyes, I've overstepped a boundary just by asking. But I usually can't tell in advance which things I'm allowed to ask and which things I'm not. There could be some rule that I just haven't discovered yet, but because I haven't discovered it yet, each case feels arbitrary to me, and thus it feels like I'm being required to read people's minds each time. Hence why I'm tempted to call Guess Culture "Read-my-mind Culture".

(Contrast this to Ask Culture, where the rule is, to me, very simple and easy to discover: every request is acceptable to make, and if the other person doesn't want you to do what you're asking to do, they just say "no".)

Sunny's Shortform

I couldn't parse this question. Which part are you referring to by "it", and what do you mean by "instead of asking you"?

Sunny's Shortform

The Civ analogy makes sense, and I certainly wouldn't stop at disproving all actually-practiced religions (though at the moment I don't even feel equipped to do that).

Well, you cannot disprove such a thing, because it is logically possible. (Obviously, "possible" does not automatically imply "it happened".) But unless you assume it is "simulations all the way up", there must be a universe that is not created by an external alien lifeform. Therefore, it is also logically possible that our universe is like that.

Are you sure it's logically possible in the strict sense? Maybe there's some hidden line of reasoning we haven't yet discovered that shows that this universe isn't a simulation! (Of course, there's a lot of question-untangling that has to happen first, like whether "is this a simulation?" is even an appropriate question to ask. See also: Greg Egan's book Permutation City, a fascinating work of fiction that gives a unique take on what it means for a universe to be a simulation.)

It's just a cosmic horror that you need to learn to live with. There are more.

This sounds like the kind of thing someone might say who is already relatively confident they won't suffer eternal damnation. Imagine believing with probability at least 1/1000 that, if you act incorrectly during your life, then...

(WARNING: graphic imagery) ...upon your bodily death, your consciousness will be embedded in an indestructible body and put in a 15,000-degree oven for 100 centuries. (END).

Would you still say it was just another cosmic horror you have to learn to live with? If you wouldn't still say that, but you say it now because your probability estimate is less than 1/1000, how did you come to have that estimate?

Any programming language; for large enough values it doesn't matter. If you believe that e.g. Python is much better in this regard than Java, then for sufficiently complicated things the most efficient way to implement them in Java is to implement a Python emulator (a constant-sized piece of code) and then implement the algorithm in Python. So if you chose the wrong language, you pay at most a constant-sized penalty, which is usually irrelevant, because these things are usually applied to debates about what happens in general as the data grow.

The constant-sized penalty makes sense. But I don't understand the claim that this concept is usually applied in the context of looking at how things grow. Occam's razor is (apparently) formulated in terms of raw Kolmogorov complexity -- the appropriate prior for an event X is 2^(-B), where B is the Kolmogorov complexity of X.

Let's say general relativity is being compared against Theory T, and the programming language is Python. Doesn't it make a huge difference whether you're allowed to "pip install general-relativity" before you begin? 
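
(For concreteness, the standard statement here -- as I understand it -- is the invariance theorem: for any two languages L1 and L2 there is a constant c(L1, L2) that depends on the languages but not on X, such that

    K_L1(X) ≤ K_L2(X) + c(L1, L2)

and the Occam prior from above is then P(X) ∝ 2^(-K(X)). My "pip install" worry is exactly a worry about how large c(L1, L2) gets in practice.)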

But there are cases where you can have an intuition that for any reasonable definition of a programming language, X should be simpler than Y.

I agree that these intuitions can exist, but if I'm going to use them, then I detest this process being called a formalization! If I'm allowed to invoke my sense of reasonableness to choose a good programming language to generate my priors, why don't I instead just invoke my sense of reasonableness to choose good priors? Wisdom of the form "programming languages that generate priors that work tend to have characteristic X" can be transformed into wisdom of the form "priors that work tend to have characteristic X".

Just an intuition pump: [...]

I have to admit that I kind of bounced off of this. The universe-counting argument makes sense, but it doesn't seem especially intuitive to me that the whole of reality should consist of one universe for each computer program of a set length written in a set language.

(Actually, I probably never heard explicitly about Kolmogorov complexity at university, but I learned some related concepts that allowed me to recognize what it means and what it implies, when I found it on Less Wrong.)

Can I ask which related concepts you mean?

[...] so it is the complexity of the outside universe.

Oh, that makes sense. In that case, the argument would be that nothing outside MY universe could intervene in the lives of the simulated Life-creatures, since they really just live in the same universe as me. But then my concern just transforms into "what if there's a powerful entity living in this universe (rather than outside of it) who will punish me if I do X, etc".

Sunny's Shortform

Epistemic status: really shaky, but I think there's something here.

I naturally feel a lot of resistance to the way culture/norm differences are characterized in posts like Ask and Guess and Wait vs Interrupt Culture. I naturally want to give them little pet names, like:

  • Guess culture = "read my fucking mind, you badwrong idiot" culture.
  • Ask culture = nothing, because this is just how normal, non-insane people act.

I think this feeling is generated by various negative experiences I've had with people around me, who, no matter where I am, always seem to share between them one culture or another whose rules I don't really understand. This leads to a lot of interactions where I'm being told by everyone around me that I'm being a jerk, even when I can "clearly see" that there is nothing I could have done that would have been correct in their eyes, or that what they wanted me to do was impossible or unreasonable.

But I'm starting to wonder if I need to let go of this. When I feel someone is treating me unfairly, it could just be because (1) they are speaking in Culture 1, and (2) I am listening in Culture 2 and hearing something they don't mean to transmit. If I were more tuned in to what people meant to say, my perception of people who use other norms might change.

I feel there's at least one more important pair of cultures, and although I haven't mentioned it yet, it's the one I had in mind most while writing this post. Something like:

  • Culture 1: Everyone speaks for themselves only, unless explicitly stated otherwise. Putting words in someone's mouth or saying that they are "implying" something they didn't literally say is completely unacceptable. False accusations are taken seriously and reflect poorly on the accuser.
  • Culture 2: The things you say reflect not only on you but also on people "associated" with you. If X is what you believe, you might have to say Y instead if saying X could be taken the wrong way. If someone is being a jerk, you don't have to extend the courtesy of articulating their mistake to them correctly; you can just shun them off in whatever way is easiest.

I don't really know how real this dichotomy is, and if it is real, I don't know for sure how I feel about one being "right" and the other being "wrong". I tried semi-hard to give a neutral take on the distinction, but I don't think I succeeded. Can people reading this tell which culture I naturally feel opposed to? Do you think I've correctly put my finger on another real dichotomy? Which set of norms, if either, do you feel more in tune with?

Sunny's Shortform

But atoms aren't similar to calories, are they? I maintain that this hypothesis could be literally false, rather than simply unhelpful.

Sunny's Shortform

I wouldn't call the dead chieftain a god -- that would just be a word game.

But then, how did this improbably complicated mechanism come into existence? Humans were made by evolution, were gods too? But then again those gods are not the gods of religion; they are merely powerful aliens. But powerful aliens are neither creators of the universe, nor are they omniscient.

Wait wait! You say a god-like being created by evolution cannot be a creator of the universe. But that's only true if you constrain that particular instance of evolution to have occurred in *this* universe. Maybe this universe is a simulation designed by a powerful "alien" in another universe, who itself came about from an evolutionary process in its own universe.

It might be "omniscient" in the sense that it can think 1000x as fast as us and has 1000x as much working memory and is familiar with thinking habits that are 1000x as good as ours, but that's a moot point. The real thing I'm worried about isn't whether there exists an omniscient-omnipotent-benevolent creature, but rather whether there exists *some* very powerful creature who I might need to understand to avoid getting horrible outcomes.

I haven't yet put much thought into this, since I only recently came to believe that this topic merits serious thought, but the existence of such a powerful creature seems like a plausible avenue to the conclusion "I have an infinite fate and it depends on me doing/avoiding X".

[...] Occam's razor [...]

This is another area where my understanding could stand to be improved (and where I expect it will be during my next read-through of the sequences). I'm not sure exactly what kind of simplicity Occam's razor uses. Apparently it can be formalized as Kolmogorov complexity, but the only definition I've ever found for that term is "the Kolmogorov complexity of X is the length of the shortest computer program that would output X". But this definition is itself in need of formalization. Which programming language? And what if X is something other than a stream of bits, such as a dandelion? And even once that's answered, I'm not quite sure how to arrive at the conclusion that Kolmogorov-ly simpler things are more likely to be encountered.
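
To make my confusion concrete, here's the sort of toy example I have in mind (a sketch in Python; the string and the numbers are just ones I made up, and since Kolmogorov complexity is uncomputable, exhibiting short programs only ever gives upper bounds):

    # Toy illustration only: K(x) is uncomputable, so a short program that
    # outputs x gives an *upper bound* on K(x), never the exact value.

    x = "ab" * 500_000  # a highly regular million-character string

    # Two Python programs that both print x:
    literal_program = "print(" + repr(x) + ")"  # spells x out character by character
    clever_program = 'print("ab" * 500_000)'    # exploits the pattern

    print(len(literal_program))  # a huge upper bound on K(x)
    print(len(clever_program))   # a tiny upper bound on K(x)

The gap between those two lengths is what "simplicity" is supposed to track, but notice that this still doesn't tell me which language's program lengths I should be using in the first place.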

(All that being said, I'd like to note that I'm keeping in mind that just because I don't understand these things doesn't mean there's nothing to them. Do you know of any good learning resources for someone who has my confusions about these topics?)

And it's not like you created the universe by simulating it, because you are merely following the mathematical rules; so it's more like the math created that universe and you are only observing it.

If the beings in that mathematical universe will pray to gods, there is no way for anyone outside to intervene (while simultaneously following the mathematical rules). So the universe inside the Game of Life is a perfectly godless universe, based on math.

That much makes sense, but I think it excludes a possibly important class of universe that is based on math but also depends on a constant stream of data from an outside source. Imagine a Life-like simulation ruleset where the state of the array of cells at time T+1 depended on (1) the state of the array at time T and (2) the on/off state of a light switch in my attic at time T. I could listen to the prayers of the simulated creatures and use the light switch to influence their universe such that they are answered.
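
Here's a minimal sketch of the kind of ruleset I mean (Python; the specific rule is something I invented purely to illustrate the dependence on (1) and (2), not actual Life):

    def step(grid, attic_switch_on):
        """One tick of a made-up Life-like rule with one bit of outside input."""
        rows, cols = len(grid), len(grid[0])
        new_grid = [[0] * cols for _ in range(rows)]
        for r in range(rows):
            for c in range(cols):
                live_neighbors = sum(
                    grid[(r + dr) % rows][(c + dc) % cols]
                    for dr in (-1, 0, 1)
                    for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0)
                )
                # Ordinary Life births a cell on exactly 3 live neighbors; here
                # the outside bit -- the light switch in my attic -- shifts that
                # threshold, giving me a channel for answering prayers.
                birth_count = 2 if attic_switch_on else 3
                survives = grid[r][c] == 1 and live_neighbors in (2, 3)
                born = grid[r][c] == 0 and live_neighbors == birth_count
                new_grid[r][c] = 1 if (survives or born) else 0
        return new_grid

Such a universe is still fully mathematical from the inside, but its history isn't determined by its initial state alone.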

Sunny's Shortform

You make a good point -- even if my belief was technically true, it could still have been poorly framed and unactionable (is there a name for this failure mode?).

But in fact, I think it's not even obvious that it was technically true. If we say "calories in" is the sum of the calorie counts on the labels of each food item you eat (let's assume the labels are accurate) then could there not still be some nutrient X that needs to be present for your body to extract the calories? Say, you need at least an ounce of X to process 100 calories? If so, then one could eat the same amount of food, but less X, and potentially lose weight.

Or perhaps the human body can only process food between four and eight hours after eating it, and it doesn't try as hard to extract calories if you aren't being active, so scheduling your meals to take place four hours before you sit around doing nothing would make them "count less".

Calories are (presumably?) a measure of chemical potential energy, but remember that matter itself can also be converted into energy. There's no antimatter engine inside my gut, so my body fails to extract all of the energy present in each piece of food. Couldn't the mechanism of digestion also fail to extract all the chemical potential energy of species "calorie"?

Sunny's Shortform

Thanks for the feedback! Here's another one for ya. A relatively long time ago I used to be pretty concerned about Pascal's wager, but then I devised some clever reasoning why it all cancels out and I don't need to think about it. I reasoned that one of three things must be true:

  1. I don't have an immortal soul. In this case, I might as well be a good person.
  2. I have an immortal soul, and after my bodily death I will be assigned to one of a handful of infinite fates, depending on how good of a person I was. In this case it's very important that I be a good person.
  3. Same as above, but the decision process is something else. In this case I have no way of knowing how my infinite fate will be decided, so I might as well be a good person during my mortal life and hope for the best.

But then, post-LW, I realized that there are two issues with this:

  • It doesn't make any sense to separate out case 2 from the enormous ocean of possibilities allowed for by case 3. Or rather, I can separate it, but then I need to probabilistically penalize it relative to case 3, and I also need to slightly shift the "expected judgment criterion" found in case 3 away from "being a good person is the way to get a good infinite fate", and it all balances out.
  • More importantly, this argument flippantly supposes that I have no way of discerning what process, if any, will be used to assign me an infinite fate. An infinite fate, mind you. I ought to be putting in more thought than this even if I thought the afterlife only lasted an hour, let alone eternity.

So now I am back to being rather concerned about Pascal's wager, or more generally, the possibility that I have an immortal soul and need to worry about where it eventually ends up.

From my first read-through of the sequences, I remember that they claim to show that the idea of there being a god is somewhat nonsensical, but I didn't quite catch the argument the first time around. So my first line of attack is to read through the sequences again, more carefully this time, and see if they really do give a valid reason to believe that.

Sunny's Shortform

This belief wasn't really affecting my eating habits, so I don't think I'll be changing much. My rules are basically:

  1. No meat (I'm a vegetarian for moral reasons).
  2. If I feel hungry but I can see/feel that my stomach is full (by looking at or touching my belly), I'm probably just bored or thirsty, and I should consider not eating anything.
  3. Try to eat at least a meal's worth of "light" food (like toast or cereal as opposed to pizza or nachos) per day. This last rule is just to keep me from getting stomach aches, which happens if I eat too much "heavy" food in too short a time span.

I think I might contend that this kind of reflects an agnostic position. But I'm glad you asked, because I hadn't noticed before that rule 2 actually does implicitly assume some relationship between "amount of food" and "weight change", and is put in place so I don't gain weight. So I guess I should really have said that what I tossed out the window was the extra detail that calories alone determine the effect food will have on one's weight. I still believe, for normal cases, that taking the same eating pattern but scaling it up (eating more of everything but keeping the ratios the same) will result in weight gain.

Sunny's Shortform

It's happened again: I've realized that one of my old beliefs (pre-LW) is just plain dumb.

I used to look around at all the various diets (Paleo, Keto, low carb, low fat, etc.) and feel angry at people for having such low epistemic standards. Like, there's a new theory of nutrition every two years, and people still put faith in them every time? Everybody swears by a different diet and this is common knowledge, but people still swear by diets? And the reasoning is that "fat" (the nutrient) has the same name as "fat" (the body part people are trying to get rid of)?

Then I encountered the "calories in = calories out" theory, which says that the only thing you need to do to lose weight is to make sure that you burn more calories than you eat.

And I thought to myself, "yeah, obviously."

Because, you see, if the orthodox asserts X and the heterodox asserts Y, and the orthodox is dumb, then Y must be true!

Anyway, I hadn't thought about this belief in a while, but I randomly remembered it a few minutes ago, and as soon as I remembered its origins, I chucked it out the window.

Oops!

(PS: I wouldn't be flabbergasted if the belief turned out true anyway. But I've reverted my map from the "I know how the world is" state to the "I'm awaiting additional evidence" state.)
