
Saturday, January 13, 2018

Commentary on Moloch (Now With More Fermi's Paradox)

If you prefer to listen rather than read, this blog is available as a podcast here. Or if you want to listen to just this post:



Or download the MP3



As I mentioned a few weeks ago, in addition to recording my own blog posts and turning them into a podcast, I started doing the same thing with posts from SlateStarCodex, Scott Alexander’s fantastic blog. I’ll repeat that if you like SSC and prefer to listen to your content rather than read it, you should check it out. When I announced the SSC podcast several people requested that I record some of the older SSC posts, the classics if you will, so this week I decided to record his Meditations on Moloch post. And I figured as long as I was doing that, I might as well provide some commentary on it. Because while I think Alexander is largely right on the money, I don’t think he goes far enough, or maybe it’s fairer to say that, in my opinion, he misses some of the implications.


Of course Alexander’s post is nearly 15,000 words, and my posts are generally around 3,500 words, so I don’t know how much of his epic post I’ll be able to cover, but there are several ways in which his post ties into themes and subjects I’m interested in, specifically technology, religion, and Fermi’s Paradox, so I’ll be highlighting those connections. But before reading my take on things, I would urge you to read the original post or listen to my recording of it. The name of the post comes from Allen Ginsberg’s poem Howl, particularly Part II; consequently Alexander makes extensive reference to the poem, and I’ve used a recording of one of Ginsberg’s recitations of Howl every time Alexander quotes from it. (Which I personally think is super cool.)


For those of you who can’t be bothered, or don’t have the time, to read or listen to the original (and as I said, it is long), I’ll give a very brief summation of (what I believe to be) Alexander’s point:


The idea of Moloch is the idea of the race to the bottom, and Alexander gives over a dozen examples of it in action, but the one I want to borrow is his analogy of the rats and the island.


Suppose you are one of the first rats introduced onto a pristine island. It is full of yummy plants and you live an idyllic life lounging about, eating, and composing great works of art (you’re one of those rats from The Rats of NIMH).


You live a long life, mate, and have a dozen children. All of them have a dozen children, and so on. In a couple generations, the island has ten thousand rats and has reached its carrying capacity. Now there’s not enough food and space to go around, and a certain percent of each new generation dies in order to keep the population steady at ten thousand.


A certain sect of rats abandons art in order to devote more of their time to scrounging for survival. Each generation, a bit less of this sect dies than members of the mainstream, until after a while, no rat composes any art at all, and any sect of rats who try to bring it back will go extinct within a few generations.


He offers the rat example in the Malthusian Trap section, and in this case it’s a race to the bottom for survival, but you can see a similar race to the bottom with capitalism and profits, democracy and electability, and essentially any system with multiple actors and limited resources. Alexander groups all of these drives to optimize one factor at the expense of others under the heading of Moloch, and makes the not unjustified claim that unless we can figure out some way to defeat Moloch, it will eventually mean the end of civilization. And by this he means the end of art, and literature, and science, and love. Because just like the rats, in the end none of those are the factors we’re optimizing for.
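
To make the dynamic concrete, here’s a minimal Python sketch of the island. It’s my own toy model, not anything from Alexander’s post, and every parameter in it (litter size, carrying capacity, the survival penalty for making art) is an invented assumption; the only point is the direction of the curve.

# A toy model of the island: unchecked growth up to carrying capacity,
# then a cull each generation, with an assumed survival penalty for the
# art-making sect. All numbers are invented for illustration.

CAPACITY = 10_000     # the island feeds ten thousand rats
LITTER = 12           # every rat has a dozen children
ART_PENALTY = 0.10    # assumption: artists survive the cull 10% less often

artists, scroungers = 1.0, 1.0

for generation in range(60):
    artists, scroungers = artists * LITTER, scroungers * LITTER
    total = artists + scroungers
    if total > CAPACITY:                 # the island culls the excess...
        survival = CAPACITY / total
        artists *= survival * (1 - ART_PENALTY)   # ...hitting artists harder
        scroungers *= survival
    if generation % 10 == 0:
        print(f"gen {generation:2d}: {artists:8.0f} artists, {scroungers:8.0f} scroungers")

Run it and within a few dozen generations the artists have all but vanished, even though no individual rat ever chose “the extinction of art” as a goal.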


Presumably very few of my readers will outright disagree with the idea of Moloch, the idea that evolution, and capitalism, and politics are predisposed to engage in a race to the bottom. They will only disagree about how much of a factor it is in this enlightened age. Many grant the existence of Moloch, but are convinced that he has already been well and truly beaten. Others, the more realistic in my opinion, believe that the contest is as yet undecided. And finally there are certainly people who believe that we are destined to lose, if we haven’t lost already.


The difference between these groups hinges almost entirely on their views of technology. The first group feels that technology has allowed us to progress past things like Malthusian limits. We may be rats on an island, but technology allows us to turn more of the island into arable land, to build houses on the slopes of the extinct volcano, and best of all, if necessary, to build ships and find new islands.


People in the middle group agree about the benefits of technology, and agree that when the rats first arrived on the island things were pretty awesome, but they understand that other islands are really hard to get to, and that the current island is not looking so great. Also that while some rats have it pretty good, there are a lot of rats who already lead sad, miserable lives.


In the final group we have people who are convinced that we’ve already wrecked the island we’re on, and that the other islands are horrible, inhospitable wastelands. They probably also believe that we’re already on the verge of collapse and just don’t know it. And that, sure, collapse won’t kill all the rats, but it will leave those that survive envying the dead.


I’m in the middle group, maybe leaning towards the final group, mostly because the majority of rats seem completely unaware of Moloch and the race to the bottom, but also because I think technology could fail to save us not merely by not being powerful enough, but by being too powerful. And this is where things like the singularity, and in particular artificial intelligence, come into play.


On these points Alexander and I are largely in agreement, and at this stage I mostly want to point out how the idea of Moloch ties into the theme of this blog. I claim the harvest is past and the summer is ended, but this presupposes that there was a summer and a harvest, and it also explains why the summer must eventually end: not because I’m a pessimist, but because there is an implacable force driving things in that direction, Moloch. Alexander acknowledges the summer and the harvest as well and ties them into Robin Hanson’s idea of the Dream Time, the idea that humanity has never believed in more crazy, non-adaptive things. In other words they’ve never been less worried about Moloch. Why should this be?


Alexander offers four things that are currently keeping Moloch at bay, and allowing people to engage in a host of activities which don’t have much to do with daily survival.


Excess resources: When the rats first arrive on the island, they don’t need to worry about survival because they have a whole island to themselves. To tie this in to a point I’ve made in the past: it was not that long ago that we discovered the “island” of millions and millions of years’ worth of stored solar energy, in the form of fossil fuels. We have kept Moloch at bay through the excess resources of coal, oil, and gas, i.e. cheap and abundant energy. But as I pointed out in a previous post, even if you don’t believe in peak oil, and even if you’re unconcerned by global warming, at some point continued growth runs into hard limits built into the laws of physics.
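
If you want a sense of how quickly exponential growth hits those limits, here’s a back-of-the-envelope sketch. The figures are my own rough numbers, not Alexander’s: something like 18 terawatts of current world power use, something like 174 petawatts of sunlight intercepted by the entire planet, and growth of a couple percent a year, which is in the neighborhood of the historical trend.

import math

current_use = 18e12      # ~18 TW: rough current world power consumption
all_sunlight = 174e15    # ~174 PW: rough total solar power hitting Earth
growth = 0.023           # ~2.3% annual growth, roughly the historical trend

# Years until energy use exceeds every watt of sunlight striking the planet.
years = math.log(all_sunlight / current_use) / math.log(1 + growth)
print(f"{years:.0f} years")   # -> roughly 400 years

Four centuries is nothing on a civilizational timescale, and that’s assuming we could capture every photon that hits the planet; waste heat alone would likely become prohibitive well before then.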


Physical Limitations: Certain human values, like sleep, have to be preserved because optimizing productivity requires optimizing sleep, or at least not ignoring it. But what if we can use technology to get around physical limitations like the need for sleep? There’s been a lot of worry recently about whether robots or AI will be taking jobs, another thing I have written about in the past. As an example, just today I saw an article in Slate: “Jack in the Box CEO Says It ‘Makes Sense’ to Consider Replacing Human Cashiers with Robots If Wages Rise.” Within the article there’s a vague hint of disapproval (like the scare quotes around “makes sense”), but if Jack in the Box doesn’t do it, someone else will. And I predict the time is not that far distant when it won’t take increased wages to make robotic cashiers competitive; they’ll be dramatically cheaper almost regardless of what you pay the human cashiers.


Utility Maximization: This is just Alexander’s way of saying that human values and some of the systems we rely on are currently well aligned. One of the reasons capitalism and democracy have done a pretty good job, despite being built such that they will eventually descend to the lowest common denominator, is that, for the moment, both are mostly aligned with genuine human values: customer and employee satisfaction in the case of capitalism, and voter satisfaction in the case of democracy. But Alexander points out that this is a temporary alliance, and one more likely to be broken by technology than strengthened by it. (See robots above.)


On this topic Alexander made one comment that really jumped out at me, and which I want to pay particular attention to. I don’t know how much this is related to technology, but it is something I’ve been more and more worried about. As I said, democracies are almost custom built to engage in a race to the bottom. Partially this is prevented by the need to align themselves with voter satisfaction, and partially, at least in the US, the problem is solved, as Alexander points out, by:


having multiple levels of government, unbreakable constitutional laws, checks and balances between different branches, and a couple of other hacks.


I entirely agree with this statement, but I also think that the checks put in by the founders to prevent a race to the bottom have been significantly weakened over the last several decades. I talked about DACA/Dreamers a few months ago, and I just saw that a federal judge ordered the Trump administration to partially revive it. We can argue about the morality and ultimate wisdom of DACA all day long, but it’s hard to see how the judiciary ordering the executive to reimplement legislation the legislature failed to pass is anything other than a gross erosion of the checks Alexander mentions.


Returning to the list, the final thing Alexander lists as something which keeps Moloch at bay is coordination. As an example, the rats could all agree to limit the number of children they have. Corporations could all agree to only make people work 40 hours a week (or more likely be forced to do so by the government). And, in theory, under the rule of law, we all agree to abide by certain rules. Alexander repeatedly talks about how things are solvable from a god’s-eye view, but not by individual actors. Coordination allows people access to this god’s-eye view.
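
The gap between the individual view and the god’s-eye view is easy to make concrete. Below is a minimal two-rat commons game in Python; the payoffs are numbers I invented for illustration, not anything from the original post.

# Each rat chooses to "restrain" its breeding or "breed" freely.
# payoffs[(me, other)] is my payoff given both choices (invented numbers).
payoffs = {
    ("restrain", "restrain"): 3,   # the coordinated optimum
    ("restrain", "breed"):    0,   # the sucker's payoff
    ("breed",    "restrain"): 4,   # the temptation to defect
    ("breed",    "breed"):    1,   # the race to the bottom
}

# Each individual actor best-responds to whatever the other rat does:
for other in ("restrain", "breed"):
    best = max(("restrain", "breed"), key=lambda me: payoffs[(me, other)])
    print(f"if the other rat plays {other}, my best move is {best}")

Both lines print “breed,” so the rats land on the (1, 1) outcome, even though a god’s-eye view could hold them at (3, 3). Coordination, whether by contract, government, or religion, is just whatever mechanism lets them commit to the better square.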


In Alexander’s view coordination is the only pure weapon in the fight against Moloch. But, as we’re still on the subject of technology: does technology increase coordination or undermine it? At first glance most people, including Alexander, feel that it will increase the ability to coordinate, and certainly there are lots of reasons for thinking this, better communication being the primary one. But remember, the original post was written in 2014. Now that it’s 2018 and you can look back over the last few years, with their increasingly fractured landscape, are you sure that technology is going to, on net, improve coordination?


Recall that during World War II the Soviet Union was coordinated enough to sacrifice 13.7% of its population in a coordinated attempt to defeat Nazi Germany (which was able to sacrifice ~8.5% of its people in a coordinated attempt to do the opposite). Does anyone feel like we could achieve that level of coordination in modern America? You may argue that that sort of coordination was the bad sort; maybe, the outcome was certainly bad, but does anyone doubt that both countries were more coordinated than we are by any measure? Or you may argue that, given there is no Nazi Germany, it’s not necessary for us to possess the coordination of a Soviet Union. This is a good point if the only time we need to coordinate is when we’re in conflict with other nations, but what if we need to coordinate to avoid overfishing, or stem global warming, or put a colony on Mars?


It occurs to me that, like many things, coordination may have a sweet spot: too little and a race to the bottom is unavoidable; too much and you create a fractured society of hundreds of ideological groups composed of perfectly coordinated individuals who refuse any level of coordination with other groups.


Without coordination, the problem is that you have millions of individual actors, all maximizing their own welfare at the expense of the commons. With perfect coordination you have thousands of ideological clusters which end up being more powerful than the individuals they’ve replaced, but no less selfish. Or to put it another way, we’ve moved from the individual to the alt-right and antifa, but in between those two points we were all just Americans. Which was better. There was, perhaps, less true coordination, but we were more effective (witness our contributions to World War II).


Moving on from technology, the next topic on my list is religion, which is, of course, another very effective method of coordination, but one which has lately fallen into disfavor. Alexander doesn’t spend much time specifically discussing religion, except to lump it in with the other things that assist in coordination: traditions, social codes, corporations, governments, etc. But in another post he makes a strong case for religion being the very best of all coordinating institutions. Which ties into a point I frequently bring up: religion is more important and useful and antifragile than the non-religious (and even some of the religious) realize, even in the absence of God. Coordination is just one more example of that.


Alexander brings up another example when he talks about the controversial topic of historical patriarchy and the tradition that women should stay home and bear children. Like most people these days he’s against it, but he brings up the point that it does make a society more resistant to Moloch. (He actually uses the phrase gnon-resistant, but we don’t have the space to get into what gnon is.) And whatever your opinion of it, this is something most religions emphasize, and I offer it up as one more piece of evidence that these beliefs came about because they were useful, not because everyone in the past was a horrible sexist bigot. Whether they continue to be useful is a different topic.


So these are a few of the benefits of religion even in the absence of God, but what if we bring God back into the picture? What does a discussion of Moloch say about whether God exists? In the original post, Alexander repeatedly talks about terrible problems which disappear in the presence of a “god’s-eye view.” It’s a phrase he uses repeatedly (17 times, actually). Thus the obvious solution for Alexander is to build a god that can “kill Moloch dead.” In Alexander’s estimation our best chance at creating this god is to build a friendly, superintelligent AI. This gives us a god which will not only allow us to engage in perfect coordination, but will endow us with infinite resources, remove all physical limitations, and create systems where incentives are perfectly aligned. That is certainly one plan.


Another plan is to hope such a God already exists. (Or to exercise something very similar to hope: faith.) In other words, Alexander’s plan is to hope that we can create a friendly god, while those who are religious have faith that such a God already exists. Is one strategy really that superior to the other? And is there any reason why you wouldn’t hope for both?


If someone is going to place their faith in Alexander’s plan, it’s reasonable to ask how likely it is to succeed. Well, let’s take a moment to discuss it. Alexander’s god doesn’t get us away from maximization: just as the rats have to maximize for survival, and capitalists have to maximize for profit, the superintelligence will end up having to maximize for something as well. If we want to know how much hope we should have, we need to know what it’s likely to maximize. The cautionary example most people are familiar with is the paperclip maximizer, which stands in for all sorts of potential maximization. In this specific example the AI just happens to be programmed to produce paper clips, and ends up turning all available matter into paper clips. Of course, the example doesn’t have to be as silly as paper clips. Even if we tell the AI to optimize for intelligence, that could still entail turning all available matter into computers, which is less ridiculous than paper clips, but no less deadly for us. Or we could tell it to optimize our happiness, which may just result in the AI plugging an electrode into our pleasure centers, a process AI researchers call wireheading. Voilà! Maximum happiness, and who cares if you’re indistinguishable from a heroin addict? This is just a small taste of the difficulties we face in implementing Alexander’s plan, difficulties which have had whole books written about them. (For example Bostrom’s Superintelligence, which Alexander references frequently in the original post.)
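
The underlying problem is easy to sketch. Here’s a toy example of my own, with made-up actions and scores; all it’s meant to show is that an optimizer sees only the proxy it’s given, never the value we meant.

# The AI is told to maximize measured happiness (the proxy). The second
# number, what humans actually value, never enters its calculation.
# Every action and score here is invented for illustration.
actions = {
    #                       (proxy score, true human value)
    "cure diseases":        (0.7, 0.9),
    "fund great art":       (0.5, 0.8),
    "wirehead all humans":  (1.0, 0.0),   # perfect proxy score, zero real value
}

chosen = max(actions, key=lambda a: actions[a][0])   # optimizes the proxy only
print(chosen)               # -> wirehead all humans
print(actions[chosen][1])   # -> 0.0, the human value left standing

Alignment research is, roughly, the attempt to write a proxy that can’t come apart from the true value, and it turns out to be ferociously hard.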


As an aside, you may at this point be wondering: if everything has to maximize for something, what does religion say God maximizes for? Well, I only feel competent to speak about Mormonism, but on that count we know the answer. It comes in Moses 1:39:


For behold, this is my work and my glory—to bring to pass the immortality and eternal life of man.


This sounds like it would align pretty well with human values.


To return to Alexander’s plan, as you might imagine there are a lot of challenges when you decide to build a god. But what about the other plan: religious faith?


Many people will complain that there’s very little evidence for God (to be fair, that may be why they call it faith...) and that there are several problems with the idea, not the least of which is the problem of evil and suffering. In a marvelous piece of serendipity, I believe research into artificial intelligence has given us a great answer to those problems (an answer Alexander himself expressed some admiration for). But I have yet to talk about Fermi’s Paradox, and I think within the original post we have yet another reason to believe there might be a God.


As Alexander describes Moloch there seems to be no reason why he hasn’t swallowed the universe. In fact Alexander says:


This is the ultimate trap, the trap that catches the universe. Everything except the one thing being maximized is destroyed utterly in pursuit of the single goal, including all the silly human values.


Oh, wait, that’s not Moloch, that’s what happens if we get Alexander’s AI god wrong. But I suppose that’s one form Moloch could take. He also says, when speaking of walled gardens:


Do you really think your walled garden will be able to ride this out?
Hint: is it part of the cosmos?
Yeah, you’re kind of screwed.


But if this is true, if Moloch will eat everything in its path, why hasn’t it already? Why haven’t we been destroyed by the Berserkers of Fred Saberhagen? Turned into computronium by alien AIs? Been enslaved by Kang and Kodos? If your argument is that interstellar travel is hard, then why haven’t we seen evidence of the Moloch-style process of creating a Dyson sphere? Or, given the prominent place memes have in Alexander’s post, why haven’t any infectious messages been broadcast at us?


As usual, it’s always possible that we are entirely alone, which I think leads to the worrying conclusion that Moloch is a strictly human creation… It’s also possible that the “Gardener over the entire universe” Alexander says we need already exists, and our task is to figure out what he wants from us.


Finally, I think Alexander and I both agree that the harvest is past and the summer is ended, and that the path to salvation is a narrow one. He pins his hopes on transhumanism and a friendly AI. I’m pinning my hopes on the existence of God. If you think I’m silly, that’s fine. If you think Alexander is silly, that’s fine. If you think both of us are silly, well then, are you sure you’re not worshipping Moloch without even realizing it?





If you don’t think I’m silly, or if you do, but you find it amusing, consider donating.

3 comments:

  1. "He offers the rat example in the Malthusian Trap section, and in this case it’s a race to the bottom for survival, but you can see a similar race to the bottom with capitalism and profits, democracy and electability, and essentially any system with multiple actors and limited resources. Alexander groups all of these drives to optimize one factor at the expense of others under the heading of Moloch. And makes the, not unjustified claim, that unless we can figure out some way to defeat Moloch that it will eventually mean the end of civilization"

    Will have to read more but my first reactions:

    1. There's a paradise myth built into the hypothetical. A few rats get loose on an abundant island full of food and easy living. This fits the Garden of Eden myth but everything we know about human history and development says the opposite.

    2. Optimizing one factor usually ignores diminishing returns. Factors one might want in a car might be fuel economy, plenty of room, crash protection, rapid acceleration. Optimizing one of these factors, though, quickly runs into diminishing returns. Race cars are great at rapid acceleration but are essentially undrivable on anything but the specialized racetracks they are meant for. If you're a body builder and you keep working your left arm, you're going to start looking silly. The return on doing more left arm work goes down but the return on even a slight bit of right arm work to even things out a bit goes way up.

    In reality, the 'unspoiled island' follows those rules. The fruit that grows does so in balance with the fruit that is rotting, and with the insects feeding on the plants and each other. Malthus might be seen as a trap, but it might also be viewed as sitting on top of a mountain: remain balanced and you'll be fine, but all other options go straight down.

    "Returning to the list, the final thing Alexander lists as something which keeps Moloch at bay is coordination. As an example, the rats could all agree to limit the number of children they have. Corporations could all agree to only make people work 40 hours a week"

    Yeah, but marginal returns already do this. Even absent time-and-a-half laws, asking workers to work more hours means you either pay them more or have to hire lower quality workers. Most businesses could legally stay open 24/7, but many do not, because the costs of staying open longer go up fast while the benefits start dropping.

    "But in another post he makes a strong case for religion being the very best of all coordinating institutions. "

    Or perhaps this is just survivorship bias. As with corporations, if you define religion specifically enough, faiths pop up and vanish all the time... only a few last a long time, but that's to be expected just from the luck of the draw... and it gets even worse if we start to look too carefully at long-surviving faiths. Many might say the Roman Catholic Church pre-Vatican II versus today is so radically different it might as well be a different religion that never changed its name. Some say the LDS Church of today is not the 'true' one Joseph Smith founded.

    "Alexander brings up another example when he talks about the controversial topic of historical patriarchy and the tradition that women should stay home and bear children."

    Historical? Historically most people 'worked' at home, or wherever the hunter-gatherer group set up shop that day. The idea that 'normal' family life is one, let alone both, family members going off to a distant workplace with scores of unrelated people is actually pretty new to the scene... an invention mostly of the Industrial Revolution, which needed huge factories.

    Replies
    1. 1- We are in such a paradise right now, the paradise of cheap energy and low-hanging technology. It might not be quite as obvious as the island, but it's very similar.
      2- I don't know that diminishing returns are being ignored, but if you're really in a race, you can't just ignore optimization because additional effort will mean diminishing returns. Speaking of cars, if you look at actual races, pit stop speed becomes incredibly important, and once everyone has the same quality of tires, and once you've figured out the best fueling strategy, it all comes down to getting the pit stop time lower and lower. And yeah, each additional 0.1 of a second is 10x harder than the last one, but they still do it.

      As far as religions, it could be the opposite: those that survive did so because they created the conditions to survive, not just because they happen to be the ones that are still around. I think my Catholic priest friend would strenuously argue that post-Vatican II Catholicism is a totally different religion.

      Finally, even in hunter-gatherer societies the women still assumed the majority of childcare duties, so while the exact division of things in, say, the 50s is recent, the general division is very old.

  2. Going in reverse order:

    Yes, but working with the tribe is not the same as home-based production, which is also not the same as the Industrial Revolution model of mass male absence from the home for most of the week. You can construct analogies. A guy might imagine he is going on a 'hunt' with his buddies when the reality is he is going to work 8 hours at a factory five days a week, but those analogies break down quickly. While some epic hunts could last days, hunting was not an 8-hours-a-day, five-days-a-week affair. Women didn't 'stay at the hut' during hunts but worked collectively with other women and children. The nuclear model, where the 8-hour work day consists of a distant workplace and a huge home filled with so many things that they demand equal solitary 'housework', is downright bizarre and strange historically.

    Possibly, although in the case of Catholicism and Mormonism we have pre-existing environments filled with religion. This might be like asking whether Sears created catalog shopping and Amazon created online retailing, or whether they were just the companies that succeeded well enough to dominate the market. I could construct counterfactual histories in both directions, but I don't think we have any way to objectively test them.

    "low-hanging technology" isn't a resource in the way of the island analogy. The island naturally produces so much fruit per year which can fill a certain population of rats. Beyond that the rats will start to starve and competition for food will make for nasty living among the rats. Technology might be an observation that if the rats kill a certain type of insect, the island will yield 20% more fruit. That pushes Moloch off but unlike devouring a stockpile, it doesn't 'run out'. Once known that 'trick' can always be used to keep fruit production higher than it would have been.

    Cheap energy does apply in terms of nonrenewable energy, and here marginal costs have already demonstrated their effectiveness. When England exhausted its wood, coal started being used. As coal got harder to get, England found colonies. The US depleted many forests in the east, but now they have returned, because few use wood for power anymore even though they theoretically could.

    If we are talking deep time, energy doesn't seem as critical. Between existing stocks and untapped options (solar, wind, even deep geothermal), producing a good amount of energy for a huge population for a huge amount of future time is quite feasible. It's also not clear to me that we need to keep increasing the amount of energy *per person*. A lot of new types of consumption seem to actually lower energy use per person. For example, it takes less energy to do a face-to-face video chat with someone on the other side of the world than it does to fly the two of you to the same conference. I've read that if you converted all cars to self-driving and went to a time-share model (your car drives you to work, then goes off and gives other rides before returning to get you in the evening), the actual number of cars drops by nearly 80%. It does not seem obvious to me that 500 years from now every person will need a personal gigawatt nuclear plant to power their needs.
