Most everybody knows what inflation is or, at least, has felt the effects of inflation. It’s when there is an oversupply of money, which devalues the currency and makes prices go up (I know I’m way oversimplifying. There’s no need for a detailed theory of inflation to make my point, though.). I just read an article by John Danaher over at Philosophical Disquisitions called “Artificial General Intelligence and the Problem of Cognitive Inflation”. Basically, Danaher says that intelligence can act like money. As we develop AI, we increase the total amount of intelligence out there. This devalues each unit of intelligence. The main way we’d notice is through work. Jobs that require intelligence (in other words, all jobs) will be less valuable, and pay less money, than they were before. It’s an interesting position, and I don’t know what I think of it yet, but it got me thinking. If there can be monetary inflation and cognitive inflation, might there be other types of inflation?
Going with Danaher’s theory, anything that has value and can increase its supply can be inflated. We see it fairly often with fads. The Beanie Baby craze from 20ish years ago is a prime example. When first released, Beanie Babies were relatively rare. When people started wanting them, the price went way up, so Ty started making more Beanie Babies. After a while, they were everywhere. Even McDonald’s was selling Beanie Babies. As the market flooded with these toys, the price crashed. Each Beanie Baby was less valuable, or, put another way, it took more Beanie Babies to command the same amount of money. A lot of people were victims of Beanie Baby inflation.
Of course, Beanie Babies are not really the type of thing I’m talking about here as they were still an economic type of inflation. I’m thinking of things like intelligence that aren’t inherently market driven. One area that I thought of as a possibility is emotional inflation. Emotions clearly have value. The question is whether we can increase the supply of emotions in any meaningful way. The easiest way to do that right now is to have more babies. But that increases the supply of emotions too slowly. It’s a way to increase intelligence, too, but Danaher looks specifically at AI because that is the way we’re going to increase intelligence the fastest.
So, will we ever get to a point where we are creating artificial emotions (AE?)? It seems like that would be the Holy Grail of the companion robot industry. People already become emotionally attached to machines, but what would happen if those machines, robots, digital pets, cars, etc., loved us back? Would that cause emotional inflation? Would the value of love, and other emotions, diminish? Would it start taking more love to get the same effect? I don’t know.
Another area that I thought of that could be ripe for inflation is sentience. It would work in the same way as cognitive inflation or emotional inflation. If we were to build sentient machines, would that devalue sentience? Again, I don’t know.
However, I wonder if we might already be in the midst of sentience inflation. That isn’t because we’re creating new sentience, but because we’re discovering new sentience. Scientists keep finding evidence of sentience in more and more of the living world. Animals from humans to insects appear to be sentient. Some think that plants and even fungi are sentient.
The people who advance these theories usually use them to try to bolster the value of non-human life. I can’t help but wonder if that might backfire. It’s not like these discoveries of animal sentience are converting people to veganism en masse. Even people who are well aware of the research showing the rich emotional lives of animals don’t flock to veganism. Could it be that all this newfound sentience is devaluing sentience rather than increasing the value of non-human life? Does it take more sentience now than it once did to be welcomed to the self-aware-so-we-won’t-eat-you party? Once again, I don’t know. But I feel like there might be something to this. People know about the sentience of other animals, but they don’t care because sentience, by itself, is no longer very valuable. We have undergone sentience inflation.
Like I said at the beginning, I don’t really know what to make of all this yet. I do find it intriguing, though. Can you think of any other candidates for non-monetary inflation? I’d be curious to hear them.