September 9. I'm feeling uninspired this week, so I've gathered some one-liners that I've jotted (actually typed into Notepad++) over the last few months, while high:
Cannabis resets the kind of memory that causes boredom.
Ninety percent of wisdom is been-there-done-that.
Indecisiveness is grief: your options are your pets.
Anxiety and depression are disorders of attention.
A religion is a social organism that feeds on spiritual experience.
The presidential race is a reality TV show. They're all performers pretending to be authentic, and trying to avoid getting voted off.
Confidence is that which enables you to move on from mistakes as if you'd meant them.
September 19. A reader wants me to say more about anxiety and depression being disorders of attention. Of course that's not all they are -- sometimes there's actual brain damage. But I think a lot of us can go a long way toward mental health, just by practicing different habits of where and how we turn our attention.
Lately I've made some progress on managing anxiety, with a practice that I call expanding into pain. Every self-help guru will tell you, expansion is good and contraction is bad. What they don't tell you is what exact thing you're expanding, because it's really hard to explain. Another thing they don't tell you is that expansion feels terrible. If it felt good, we wouldn't have to be told to do it.
But for me, the pain is the key to the practice. I usually do it in the morning, when I'm still lying in bed, making the mental transition from the world of dreams to the world of earthly responsibilities. I'll be thinking about something that feels bad, and the practice is, never mind the thing, focus on the feeling, and amp it up, as strong as I can, as long as I can.
I'm sure a brain scan would reveal some action in the amygdala or wherever, but what it feels like, is that the world is made of needles and knives, and I'm expanding my astral body into them. I've started to call it my morning stretch. And after doing it enough, it becomes like a muscle that I can flex at will.
So if I'm out in the world, in some anxiety-causing situation (typically driving, which is so dangerous that if your attention lapses for half a second it can ruin your life), I can expand into it, and it's like the martial arts move where someone throws a punch and you move toward the punch, so that it hits you before it builds up any power.
Or it's like, anxiety is paying interest on pain, but if you catch it in time, you only have to pay the principal.
September 23. So the other day, after writing about pain, I started wondering about boredom. What exactly is it? Is it the opposite of pain, or another kind of pain?
Then I started thinking about attention again, and came up with this: boredom is the absence of anything that earns your attention; pain is the presence of something that demands your attention without earning it. So having to listen to your boring uncle at a family dinner is not actually boredom, but pain.
Now I'm thinking about attention as a dimension of power -- or really two dimensions. Power can force you to give attention you don't want to give, like ads, and it can give you attention you don't want, like surveillance.
Then I'm thinking, those two dimensions of attention can also make two different definitions of the self -- or two different things that the word "self" points to. The first is that you are a perspective which navigates a stream of experience. The other is that you are an object in other people's streams of experience.
This is not a new idea, and I'm not sure where I'm going with it. I just think it's strange that a concept as important as the self, which we think we understand, can point to two things, both based on attention, that don't overlap.
September 25. Depressing article, Public Opinion in Authoritarian States. The main idea: "for many of the most effective authoritarian systems, controlling the thoughts of the ruled is secondary to shaping social cleavages in the population."
Then it goes on to explain how ordinary humans do not choose their political positions out of rational thinking or even self-interest, but for social reasons: they want to believe the same stuff as their in-group, and the opposite of their out-group. And even in a supposed democracy, the ruling interests understand this and use it to control us.
October 2. Returning to the subject of attention, this subreddit thread has helped clarify my thinking, and now I can define four categories: 1) where your attention is, and you know it; 2) where your attention is, and you don't know it; 3) where your attention is not, but you know it could go there; 4) where your attention is not, and you don't know it can go there.
This is a lot like Donald Rumsfeld's speech about knowns and unknowns. He was talking in the context of war, and information technology has put us in the biggest attention war of all time. We are fighting for four things: to see, to not see, to be seen, and to not be seen. Turn the TV to the game, mute that ad, look at my tweet, and don't track me Google.
There's a lot to be said about being seen and not being seen, but I want to focus on seeing and not seeing -- especially not seeing. This is the age of raising awareness, and it's gone so far that we're overwhelmed. Our ancestors could not have imagined how many demands we have on our attention, or how hard it is to choose among them.
I think this is why some people are pushing back against mindfulness. The last thing we want is even more shit we're supposed to be paying attention to. But the way I see it, the mind is like a web browser, and mindfulness is like changing your preferences. It's difficult, but it's an investment: by giving some attention to your own filter, you can learn to filter more stuff out, and free up some attention for whatever you decide is important.
October 7-9. After the last post, I was surprised that no one challenged me on category 2, "where your attention is, and you don't know it." How is that even possible? Isn't that the definition of attention, that whatever your attention is on, you know it? Maybe it's like "Yeah, when I'm focusing on that thing, I'm aware of it, but I didn't notice I was focusing on it that much."
Two comments. From Voidgenesis on the subreddit:
This made me recall personal experiences of learning to play piano. My conscious awareness was mostly located in my dominant right hand. As I became more skilled and the left hand got involved, it was as if someone else was controlling it much of the time. That in turn reminded me of all the neurobiology research showing that the mind is not a coherent construction, but composed of many different modules competing for access to the central self-aware part (or frantic confabulator, depending on your perspective). If attention is a neurological illusion, then it taints the whole original conceptual framework.
And from Matt over email:
Perhaps the reason no one challenged your claim that attention can home in on something without us knowing it, is that people intuitively grasp how attention is more cloud-like than laser-like.
We can be thinking about an anxiety-inducing project at work, have a song stuck in our head, briefly be annoyed at another person on the train, and have a memory surface all within the space of seconds. It's easy to fail to realize that a part of our mind began replaying a song it heard from someone's smartphone before we boarded the train. We may suddenly wonder why we're thinking about so-and-so from college only to trace the memory to the fact that we've been replaying a song internally. We may or may not know why the song entered our thoughts at all.
If there's any activity that can be said to cause the most suffering, I'd say it's this: thinking about something without clearly knowing that you're thinking about it, or knowing the negative effects it's having on your body.
October 11-14. Thanks Mark for sending this 2017 recording, The Hillbilly Sutra. It's a two hour talk by Mike Snider, better known as a banjo player than a spiritual teacher. But this is the most impressive spoken word recording I've ever heard. Usually when I listen to someone talking, I use the settings on YouTube or VLC to speed it up so I can get through the chatter to the interesting ideas. But with Snider, I actually slow it down so I can transcribe stuff like this:
Consciousness is the all. Besides it there is no other. So we are putting anything and everything under this umbrella. This is why I use the term absolute consciousness. This term refers to my beingness, and the selfsame beingness of not only myself but the singular all-encompassing and all-inclusive void beingness or intelligence of everything.
It's radiating from you right now, as you, right here in this moment. It is effulgent, visceral, radiant, and absolutely void of any objectivity or subjectivity whatsoever.
October 16. Bad News for the Highly Intelligent. Like a lot of studies of high intelligence, this one just looks at members of Mensa, which is not the same thing. Still, the results are extreme: double the rate of anxiety disorders, and triple the rate of environmental allergies, compared to the general population. They speculate that more intelligent people are more overexcitable. Or...
Depressed People See the World More Realistically. The evidence from studies is not conclusive, but depressive realism fits my personal experience. I used to be happier, until three things made me smarter. First, I've been in enough car crashes now that I understand that driving is extremely dangerous and we should all be terrified every minute that we're doing it. Second, I've become a lot more aware of subtext in conversation, and now the social world feels like a minefield.
Third, I've heard that psychedelic drugs cure depression, and maybe I just need to take bigger doses, but the reason I'm suddenly cynical, is that last week I took LSD and walked up the river trail out of town. Every time I do it, it's pretty much the best day I've ever had, and I understand that every blade of grass is more impressive than the combined works of humanity. And then I have to go back to the human world and default human cognition. This song describes it perfectly.
Don't worry, I'm not considering suicide. But when I think about my own death, the main thing I feel is relief. Then when I think about it more carefully, I don't actually want to die, I just want to have no responsibilities. Don't we all -- and that's not normal. I remember in third grade when they taught us the word "responsibility", and I was immediately suspicious. Only now can I explain why: Responsibility is a social tool to maintain the inertia of activities that at one time someone felt like doing, but now nobody does.
October 18. Some feedback from the last post. First, some pessimism about the present society, a great 2018 blog post, There is Trouble in River City. The author uses two sources from the 1800s, Washington Irving's descriptions of two contrasting river towns, and Thomas Carlyle on "pig philosophy", to show how money can corrupt the human spirit, and how the thrill of material progress ends in malaise. I love this bit:
Irving had taken a steamboat up the Mississippi from New Orleans, had stopped at one of the "serene and dilapidated villages" that "border the rivers of ancient Louisiana," and had been there beguiled by the strangely joyous life of the tatterdemalion Creoles.
And a reader comment with some optimism about our species:
There are people who are trying to... evolve humanity on a spiritual level. Some call this the "5D reality" and some call it crystalline gateways, lol. Some major hoogey moogey there. But in my own meditations, it seems to resonate with the idea that we are capable of switching timelines, changing tracks, weaving in new threads entirely. I get the idea that the gods really love that shit. It feels GOOD. I think that's why we're still around.
October 21. To make laziness work for you, put some effort into it. It's a rumination around the issues of laziness, idleness, and boredom, and the main idea is that it's good to be less busy and appreciate it. There's even a bit about free will, which reminds me of the time Leigh Ann and I were driving past some wind turbines, which were barely moving, and she said, "Those windmills are lazy!" Maybe we're more like wind turbines than we think: our motivation seems to come from inside us, when really it comes from our environment, and how well it fits us.
October 31. The SpaceX Starship is a very big deal. It's fully reusable, it can take off and land vertically, and it's cheap.
The Starship is comparable in complexity to a 737, and so it's not unthinkable to have a construction rate of 500/year. If each Starship manages 300 flights per year, each carrying 150 T of cargo, then we are talking a yearly incremental cargo capacity growth of 22 million tonnes to orbit.
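The arithmetic in that quote is easy to check. Here's a quick sketch, using only the figures given above (500 ships a year, 300 flights per ship per year, 150 tonnes per flight):

```python
# Check the Starship cargo arithmetic from the quote above.
ships_built_per_year = 500        # assumed construction rate
flights_per_ship_per_year = 300
tonnes_per_flight = 150

# Each year's new ships add this much yearly cargo capacity to orbit:
added_capacity = ships_built_per_year * flights_per_ship_per_year * tonnes_per_flight
print(added_capacity)  # 22500000 -- the "22 million tonnes" in the quote
```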
So, space factories, space hotels, satellites for all kinds of crazy shit, and I won't be surprised to see space advertisements outshining the stars. This Hacker News thread is mostly a debate about using space sunshades to fix the climate.
The article says that 90% of the cargo will be fuel for missions farther out. In sci-fi, these are human missions, but I think it will be almost entirely robots. Putting humans on Mars makes no sense economically, except as a way to take advantage of people who will do anything to go there. And when they get there, they will be so bored and homesick that they will do anything to come back. I think Philip K Dick nailed it: actual Mars colonists will stay sane by living in virtual worlds so convincing that they forget they're stuck on Mars.
More generally, a paradox: the deeper humans go into outer space, the deeper they will go into their own minds.
November 4. A few weeks ago I wrote that the thought of my own death gives me a sense of relief, because I would be free of all my responsibilities. This goes back to my favorite question lately: Why is there so little overlap between what's good for us to do, and what we feel like doing? Then I was reading Matthew Crawford's book Shop Class As Soulcraft and found a clue in this line: "We want to feel that our world is intelligible, so we can be responsible for it."
Coming at the same subject from another angle, a friend writes:
I am reading some family stories of my 92 year old neighbor, whose father and grandfather were prosperous farmers. They write a lot about the vital importance of being a prominent member of the local community. Not prominent in status or power, but prominent in the ability to help, to meet needs, in their local communities. It's their obligation, but also, their honor. They don't buy seeds from the fancy store far away for half the price; they buy seeds from the local guy, because that is what a community is all about.
Why did we stop doing that? I reject any kind of moral judgment, that people were better in the old days. People are the same as ever, we always do what seems like the best thing at the time, but our environment has changed so that abandoning local communities seems like the best move.
I blame technological complexity. In a hunter-gatherer tribe, or a medieval village, or even the USA a hundred years ago, the human-built world was intelligible to you and your friends. You could wrap your head around the importance of whatever you were doing, and if something went wrong, you knew someone who could fix it.
Now the human-built world is so complex that you can't possibly know enough people to stay on top of it. We have to constantly deal with specialists who we might never talk to again. And the specialists, even if they're doing something useful, are doing the same thing over and over for strangers, so they're not really into it. And at the same time, we're supposed to be super-nice to each other and pretend to be happy, which means hiding our gnawing awareness of how many things could go wrong, that we have no idea how to deal with.
Lots of people have written about the costs of complexity. Joseph Tainter's book The Collapse of Complex Societies is mostly about physical stuff rather than human psychology. However you frame it, three things are certain: 1) More complexity, more problems. 2) It's easy to gradually raise complexity, and really hard to gradually lower it. 3) So when complexity falls, it tends to fall a lot.
November 11. Speed limits for ships can have massive benefits. This reminds me of an argument by Ivan Illich, that the world would be a lot better with a universal speed limit of 15 miles per hour.
Here's my own page of excerpts from Ivan Illich on Cars, where he calculates that Americans put so much time into their cars, including working to pay for car expenses, that their effective speed is only 5 mph. Also, the walkability of a city is a key factor in the social mobility of its residents. And why are cars killing more and more pedestrians? Probably because cities are increasingly being built for more, faster, and heavier vehicles.
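Illich's calculation is just division, but the trick is dividing by all the hours the car costs you, not just the hours behind the wheel. A sketch using the figures commonly cited from his Energy and Equity (roughly 7,500 miles a year against roughly 1,600 hours of driving plus working to pay car expenses -- my numbers here are those commonly cited estimates, not exact quotes):

```python
# Illich-style "effective speed": distance covered, divided by ALL the
# hours the car takes (driving plus earning the money to run it).
miles_per_year = 7500   # estimated annual miles driven
hours_per_year = 1600   # estimated driving time + car-related work time

effective_mph = miles_per_year / hours_per_year
print(effective_mph)  # 4.6875 -- i.e. about 5 mph, walking speed
```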
Of course, even moderate transportation reforms are politically impossible. So I might as well go full-on utopian. Instead of limiting speed, I would limit momentum: mass times velocity. Say, 2000 pound-miles per hour, or about 400 kilogram meters per second, roughly the momentum of an average-sized person on a bicycle. Old and sick people could still putter around on electric wheelchairs. Shrink the roads to trails, abandon the sprawl, turn the parking lots to food forests or high density housing, turn the railroad tracks to intercity trails.
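For anyone checking the unit conversion, here's a sketch. The 2000 pound-mile-per-hour cap is the figure proposed above; the conversion factors are the standard ones:

```python
# Convert the proposed momentum cap from imperial to SI units.
LB_TO_KG = 0.45359237   # kilograms per pound
MPH_TO_MS = 0.44704     # (meters/second) per (mile/hour)

cap_lb_mph = 2000
cap_kg_ms = cap_lb_mph * LB_TO_KG * MPH_TO_MS  # roughly 405 kg*m/s

# Sanity check: a person plus bicycle (~85 kg) at a brisk 5 m/s
# has momentum 425 kg*m/s -- the same ballpark as the cap.
print(cap_kg_ms, 85 * 5)
```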
With no heavy freight, all manufacturing would be local, and smaller scale. Every city would look different because new construction would have to be made out of local materials. Food would be local, so some cities would really need to innovate with greenhouses or algae or solar-powered protein fabricators. (This is not a low-tech utopia, only a slow one.)
Air travel would switch completely to craft that are lighter than air, and you could actually go a lot faster than 15 mph, on an airship that follows the wind. Anyone who wanted to go east would be waiting for an east wind. (I sort of already do this. There's a town eight miles east of here, with a good bike trail, and I only go when there's an east wind, so that the ride home is effortless.)
Now, let's get really crazy and put a speed limit on data. I think the best rule would be that data can only be carried on physical media by human couriers. That could be enforced by cable-cutting, signal-jamming, and shooting down drones, and big systems would have no advantage over independent agents in motivating couriers to move fast.
The social effects of slowing data to human speed are really interesting, and too big for this post. But I think both the right and left would get on board if they thought it through. And it could become politically realistic inside a thousand years, especially if there's a backlash against big data.
November 14. Why Technologists Fail to Think of Moderation as a Virtue, a smart review of a new book about artificial intelligence. There's a famous thought experiment called the paper clip maximizer, and Elon Musk tells a version about a strawberry maximizer, an AI that eventually "blankets every nook and cranny of the planet with strawberry fields and annihilates civilization in the process."
But the review cites sci-fi author Ted Chiang, who says the tech executives are projecting their own value system, every company trying to maximize growth at all costs. We're worried about computers, when corporations are already powerful and dangerous artificial beings.
I would argue it like this: an actual strawberry maximizer is almost harmless, because we understand what it's doing. It has a clear and simple motive, and the moment it destroys something we care about, in order to grow more strawberries, we'll stop it. But nobody understands what Google is doing, not even the people who work for it. It doesn't even make sense to say Google has a motive. It has behaviors, and consequences, and its behaviors are tied in with our own interests, so that it can do a lot of damage before we get serious about stopping it.
You could say the same thing about automobile traffic. A human creation has duplicated itself so successfully, that you can find it everywhere there are roads. Whole cities are made for it. It has made us completely dependent on it, while violently killing us, making us sick, isolating us socially, consuming massive resources, and throwing our planet's climate out of whack. Strangely, no one loves this radical contraption more than self-described conservatives (who have also found the one good use for it: racing).
But we don't call it an AI, because it's not intelligent in any way that we recognize. Its intelligence is hidden inside us. Our own simple motive, to go faster, has tricked us into doing all the thinking for a global takeover by an evil robot army.
Related: a blog post by Adam Elkus, The Varieties Of The Technological Control Problem. Near the end, he uses the movie Ex Machina to illustrate how our relationship with technology is like an abusive couple:
All abusive relationships begin between two individuals that believe they are both in love and that they can meet each other's varied needs. But over time several negative things occur. First, the parameters of the relationship are subtly changed to the disadvantage of one of the parties. Second, that party becomes less and less capable of recognizing what is happening to them and breaking free of the abuser. So perfectly independent and emotionally stable men and women can in theory become shells of their former selves after being trapped in an inescapable web of abuse that, sadly, they come to believe that they deserve. This is a good metaphor for [Langdon] Winner's own formulation of the control problem. Technology can be "autonomous" in the sense that humans may enter into relationships with technologies with the goal of improving their lives. However in reality people end up finding themselves hopelessly dependent on the technologies and deprived of any meaningful control or agency over them.
November 18. Fascinating article from the Guardian, I wish I'd never been born: the rise of the anti-natalists. There are different levels of anti-natalism, which I would define by how far people go in projecting their own unhappiness onto others. The first level is just not having kids of your own, because you think they're better off not existing. The next level is wanting all of humanity to go extinct. The next level is the belief that "all sentient beings should be spared from life."
Never mind the difficulty of defining "sentient", I'd like to hear a debate between a full-on anti-natalist and a squirrel. I would say, isn't life full of good feelings that make the bad feelings worthwhile? The anti-natalist would say, those are two different orders of feeling, you can't compare them, and any amount of pain makes existence a bad idea. The squirrel would say the opposite: life contains a lot of pain, but it's all part of the joy of being alive.
So the real cause of anti-natalism isn't pain, but the absence of pleasure. Now we're tripping over language again, because "pleasure" implies stuff like eating ice cream and watching Netflix, which are overwhelmed by serious depression and anxiety. We don't have a word for the kind of good feeling that would overwhelm common bad feelings, because that kind of good feeling is not part of our way of life.