In the original thread for the Order, I mentioned how, in my experience, people in internet debates usually just throw arguments at each other without changing their minds. I'm certainly guilty of that sometimes - I get way too emotional about the global warming issue. I also mentioned that I think resolving these problems is very important, and might actually be possible. So let's bring in the big guns! I believe (I'm only an amateur logician myself) that what follows is called an expected utility argument.
"Expected utility" is a method for calculating how good certain choices are. For example, suppose you're in a casino with two slot machines. On Machine 1, you have a 1/100 chance of winning $100. Machine 2 gives a 1/1000 chance of winning $500. If you want the best odds of striking it rich, which machine should you play?
What if you play each machine a thousand times? On Machine 1, you'll win an average of ten times (1000 x 1/100), win $100 each time, and end up with $1000. On Machine 2, you'll win an average of one time (1000 x 1/1000), win $500, and end up with $500. So, on average, you'll win twice as much money with Machine 1. This holds whether you play the machines 10 times, 1000 times, or 1000000 times.
What if Machine 1 costs $2 per play, and Machine 2 costs only $1? Let's try them a thousand times again. Machine 1 costs $2000 and pays $1000, so in the end you lose $1000. Machine 2 costs $1000 and pays $500, so in the end you lose $500. In this new case, playing Machine 2 is a better choice than playing Machine 1. And since not playing either machine loses you an average of $0 and wins you an average of $0, clearly not playing at all is the best idea here.
We can generalize this strategy into a way to decide between any two slot machines: take the chance of winning, multiply by the payoff, and subtract the cost. This is the average dollar amount you can expect to win each time you pull the lever, so it gives an easy way to decide which of any two slot machines is better. If the result is negative, it also means that not gambling at all will, on average, leave you better off than using the slot machine. Of course, all real slot machines come out negative - if they didn't, the casino would go broke.
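The rule above is simple enough to write down as a few lines of Python. This is just a sketch of the chance-times-payoff-minus-cost formula, with the machine numbers taken straight from the examples above:

```python
def expected_value(p_win, payoff, cost):
    """Average dollars won (or lost) per pull of the lever:
    chance of winning times payoff, minus the cost to play."""
    return p_win * payoff - cost

# The two machines from the examples, with the $2 and $1 play costs:
machine_1 = expected_value(1/100, 100, 2)   # roughly -$1 per play
machine_2 = expected_value(1/1000, 500, 1)  # roughly -$0.50 per play

print(machine_1, machine_2)  # both negative, so not playing beats both
```

Multiply either result by 1000 plays and you get the $1000 and $500 losses worked out above.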
But this doesn't just work for slot machines. It also works for all other games of chance - the lottery, roulette, et cetera. With a slight modification, it can even be extended to games of skill. Consider a very simple blackjack game: we each bet $25, play one round against each other, and whoever wins keeps the $50. Suppose I'm twice as good at blackjack as you are, so there's a 2/3 chance of me winning (let's ignore luck for now). My expected utility from playing is then ($50 x 2/3) - $25 = $8 1/3. So on average, I win $8.33 each time we play - it's a good deal for me!
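The blackjack game drops into the same formula, with the $50 pot as the payoff and my $25 stake as the cost - these numbers are the ones from the example above:

```python
# A 2/3 chance of winning the $50 pot, at a cost of my $25 stake:
ev_blackjack = (2/3) * 50 - 25
print(round(ev_blackjack, 2))  # about 8.33 dollars per game, in my favor
```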
But this doesn't just work for games! It works for almost any decision. How about this one? There's this really cute girl. I want to ask her out. If she says yes, I will be very, very happy. If she says no, I'll be really embarrassed and disappointed. How do I decide? First, I quantify happiness. Maybe I predict I'll be really, really happy if she says yes, but only moderately sad if she says no. I decide I'll be ten times as happy if she says yes as I will be sad if she says no. So we have 10 Happiness Units and -1 Happiness Unit, respectively. It doesn't matter what the heck a Happiness Unit is, as long as I use it consistently.
I obviously don't know whether she'll say yes. But I have a pretty good idea. If I'm rich and handsome, and she's desperate, there's probably a very high chance, around 9/10. If she knows I'm the sort of person who posts long essays about logic in micronational forums, there's a very low chance, around 1/10. Let's say, after considering my attractiveness level and her tastes, I estimate the probability of her saying yes at 1/4. So the chance of success is 1/4, and the payoff is the 10 Happiness Units I mentioned. But what's the cost? The cost is the embarrassment and disappointment I get when she says no. There's a 3/4 chance of her saying no, so there's a 3/4 chance I'll have to pay a cost of 1 Happiness Unit. Plug that into the expected utility equation and we get (10 x 1/4) - (1 x 3/4) = 1 3/4. So on average, I gain 1.75 Happiness Units by asking her out. We still don't know what the heck a Happiness Unit is, but since the result is positive, we know I'll be happier, on average, for having asked. So our decision is... I should ask her out!
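The same few lines of Python handle the dating decision - the only change from the slot-machine case is that the units are made-up Happiness Units instead of dollars, which is fine as long as both sides use the same unit. The probabilities and payoffs here are the ones estimated above:

```python
p_yes = 1/4            # my estimated chance she says yes
happiness_if_yes = 10  # payoff, in Happiness Units
sadness_if_no = 1      # cost, also in Happiness Units

ev_asking = p_yes * happiness_if_yes - (1 - p_yes) * sadness_if_no
print(ev_asking)  # 1.75 - positive, so asking is the better bet on average
```

Setting `p_yes = 0` reproduces the -1 sanity check in the next paragraph.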
Playing around with the numbers shows that the equation fits common sense pretty well. For example, if there's a zero percent chance she'll say yes, the equation becomes (10 x 0) - (1 x 1) = -1. The Happiness Units are negative, and therefore I shouldn't ask her out - of course I shouldn't, I know she'll say no!
This equation does lead to one surprising result, though. It says I should ask her out...but we know that three times out of four, that decision will only lead to pain and suffering. What's up with that? Well, consider a lottery. Tickets cost $1. There's a 1/4 chance of winning. If you win, you get $1 million. I'd KILL for one of those lottery tickets, even though I know that three times out of four, it's a waste of a dollar. Clearly, whether a decision is good depends both on your chances of payoff AND on how big the payoff and the cost are.
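The dream-lottery example makes the point numerically. With the ticket price, odds, and jackpot given above:

```python
# A $1 ticket with a 1/4 chance at $1 million:
ev_lottery = (1/4) * 1_000_000 - 1
print(ev_lottery)  # 249999.0 - hugely positive, despite losing 3 times in 4
```

A huge payoff can make a bet worthwhile even when the most likely single outcome is a loss.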
Now we finally get to global warming! Yay! Consider any proposed solution to global warming - let's take the Kyoto Protocol, it seems pretty popular. Now we have a good mathematical way to determine whether the Kyoto Protocol is a good idea.
Take the chance that global warming is manmade. Multiply that by the benefits of implementing the Kyoto Protocol. Subtract the cost of the Protocol. That's your expected gain in happiness from implementing the Kyoto Protocol. But the benefit of the Kyoto Protocol is not having global warming happen... so it's equal to whatever the cost of global warming would have been. The cost of the Kyoto Protocol is the economic cost of closing down polluting factories and that sort of thing.
So... if (Probability that global warming is manmade x Global warming's costs) - Cost of Kyoto Protocol > 0, we should implement the Kyoto Protocol. Or, since seeing it look all nice and proper gives me a warm glow: (P x G) - C > 0 implies Kyoto is a good idea.
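The (P x G) - C test can be sketched as one more Python function. To be clear: the numbers in the example call below are pure placeholders, not estimates - the whole point of this post is that we'd need to go research real values for P, G, and C before the answer means anything.

```python
def kyoto_worth_it(p_manmade, cost_of_warming, cost_of_protocol):
    """True if the expected benefit of the Protocol, (P x G),
    exceeds its cost C."""
    return p_manmade * cost_of_warming - cost_of_protocol > 0

# Placeholder values only: P = 0.5, G = $2 trillion, C = $0.5 trillion.
print(kyoto_worth_it(0.5, 2.0e12, 0.5e12))  # True under these made-up numbers
```

Swap in lower placeholders for P or G and the same function flips to False, which is exactly the kind of sensitivity the real numbers would have to settle.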
P, the probability that global warming is manmade, is the most complicated part. The best way to estimate it would probably be a prediction market like InTrade. I should really explain that more, but this post is already too long (that always happens!). Another good way would be looking at the percentage of intelligent scientists who hold each view.
G is the cost of global warming IF it really is happening. We could try giving a cost in dollars, based on how much property would be destroyed by flooding, extra-strong hurricanes, and stuff like that, adjusting for the cases where global warming would actually create value - the new tropical vacation paradises in Canada, say.
C is the cost of implementing the Kyoto Protocol. Nice and straightforward - a few hundred billion dollars in lost economic growth because there aren't enough coal plants to power factories, that sort of thing. I'm sure someone's done the calculations, so again, let's look for the least biased group. There might be some human suffering involved here too, so convert everything into utils and you've got the same units on both sides of the equation.
My question for everyone involved in discussing global warming on the other thread is: do you think this will work? If so, let's - all of us who are interested - try to find numbers for all three of these variables. Let's plug them into the equation and see whether the Kyoto Protocol is a good or bad idea. If the equation says it's a bad idea, I promise to very seriously reconsider my support. If the equation says it's a good idea, I'd hope you would do the same thing. Are you game?
