Archive for June, 2012


Rules-of-thumb are handy, in that they let you use a solution you’ve figured out beforehand without having to re-derive it in the heat of the moment. They may not apply in every situation, and they may not give the absolute best answer, but when you have limited time to decide, they can certainly provide the best answer you’re able to reach in the time available.

I’m currently seeking fairly fundamental rules-of-thumb that can serve as overall ethical guidelines, or even as the axioms for a full ethical system; preferably ones that pass at least the basic sniff-test of being usable in everyday life. That way I can compare them with each other, and try to figure out ahead of time whether any of them would work better than the others, either in specific sorts of situations or in general.

Here are a few examples of what I’m thinking of:

* Pacifism. Violence is bad, so never use violence. In game theory, this is the ‘always cooperate’ strategy of the Iterated Prisoner’s Dilemma, and is the simplest strategy that satisfies the criterion of being ‘nice’.

* Zero-Aggression Principle. Do not /initiate/ violence, but if violence is used against you, act violently in self-defense. This is the foundation of many variations of libertarianism. In the IPD, it satisfies both the criteria of being ‘nice’ and being ‘retaliating’.

* Proportional Force. Aim to do the least violence possible: “Avoid rather than check, check rather than harm…”. For the IPD, this meets the criteria of being ‘nice’, ‘retaliating’, and, in a certain sense, ‘forgiving’.
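These classifications can be sketched in a short Iterated Prisoner’s Dilemma simulation. This is only my own rough illustration, not anything from the original post: tit-for-tat stands in as the ZAP analogue, generous tit-for-tat is a loose analogue of proportional force, and the payoff values (3, 5, 1, 0) are the conventional ones from Axelrod-style tournaments.

```python
import random

def always_cooperate(my_hist, their_hist):
    # Pacifism: cooperate no matter what -- 'nice', but never retaliates.
    return "C"

def tit_for_tat(my_hist, their_hist):
    # ZAP analogue: 'nice' and 'retaliating' -- cooperate first,
    # then mirror the opponent's previous move.
    return their_hist[-1] if their_hist else "C"

def generous_tit_for_tat(my_hist, their_hist):
    # Loose proportional-force analogue: 'nice', 'retaliating', and
    # 'forgiving' -- like tit-for-tat, but forgive a defection 10% of the time.
    if their_hist and their_hist[-1] == "D":
        return "C" if random.random() < 0.1 else "D"
    return "C"

# Conventional payoffs: (my points, their points) for each pair of moves.
PAYOFFS = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
           ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play(strategy_a, strategy_b, rounds=100):
    # Run an iterated match and return the two total scores.
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        pay_a, pay_b = PAYOFFS[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b
```

For instance, matching `always_cooperate` against an always-defect opponent shows why the ‘retaliating’ criterion matters: the pacifist scores zero in every round, while tit-for-tat only loses the first one.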


I’m hoping to learn of rules-of-thumb which are at least as useful as the ZAP. I know and respect certain people who base their own ethics on the ZAP but reject the idea of proportional force, and I’m hoping to learn of additional alternatives so I can have a better idea of the range of available options.

Any suggestions?


Does something like this seem to you like a reasonable rule of thumb for helping to handle scope insensitivity to low probabilities?

There’s roughly a 30-to-35-in-a-million chance that you will die on any given day. So if I’m dealing with a probability of one in a million, then I ‘should’ spend about 30 times as much time preparing for my imminent death within the next 24 hours as I do playing with the one-in-a-million shot. If it’s not worth spending 30 seconds preparing for dying within the next day, then I should spend less than one second dealing with that one-in-a-million shot.
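The arithmetic above can be sketched in a few lines. This is just a sketch of the proportionality rule described in the paragraph: the 30-per-million baseline comes from the figure quoted there, and the function name is my own invention.

```python
# Baseline everyday risk of dying on any given day: roughly 30 per
# million (about 30 micromorts), per the figure quoted above.
DAILY_DEATH_RISK = 30e-6

def proportional_attention(event_probability, death_prep_seconds):
    # Scale the time budget for a low-probability event by the ratio of
    # its probability to the baseline daily death risk.
    return death_prep_seconds * (event_probability / DAILY_DEATH_RISK)

# If preparing for possible death today merits 30 seconds, a
# one-in-a-million shot merits about one second.
print(proportional_attention(1e-6, 30.0))
```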

Relatedly, can you think of a way to improve it, such as making it more memorable? Are there any pre-existing references – not just to micromorts, but to comparing them with other probabilities – which I’ve missed?