A Little Envy Can Be a Good Thing (in Humans and Machines)

When most of us scroll through social media and inevitably compare ourselves to those around us, we feel crappier, like we’re missing out or falling behind in our personal lives or work. Not coincidentally, there’s also a growing understanding of well-being and happiness as subjective and adaptive: your happiness largely depends on your expectations. Your expectations adapt, however, and not only to your conditions, but to the conditions of those around you.

You probably thought your drawing in 5th grade was just fine until you saw Linda’s. That’s also why people don’t necessarily get happier as they get wealthier – the comparisons and expectations keep changing – first you want the house, then the yacht, the island, a political office, then maybe a planet (close by). It’s easy to imagine, then, that being exposed to so many people’s lives shows each of us conditions that appear in some way better than our own, setting our expectations higher and increasing the likelihood of unhappiness.

Social Comparison

A lot is understood about what actually matters for being happy, both with your life and in your life – social connections, time meaningfully spent, being healthy, appreciating what you have – and I completely agree with all of it. But I want to focus on the role of jealousy and envy, which are often derided.

While social media has undoubtedly exacerbated social comparison and envy, the feelings themselves have been around for a long time:

“Whoever sang or danced best, whoever was the handsomest, the strongest, the most dexterous, or the most eloquent, came to be of most consideration; and this was the first step towards inequality…From these first distinctions arose … envy: and the fermentation caused by these new leavens ended by producing combinations fatal to innocence and happiness.” Rousseau, On the Origin of the Inequality of Mankind

The natural response can be like the Stoics, to limit the exposure and stop comparing yourself to others:

“How much time he gains who does not look to see what his neighbor says or does or thinks, but only at what he does himself.” – Marcus Aurelius  

But while removing yourself from the barrage of updates and comparisons is essential to focus on improving yourself, that’s likely not enough. One of the methods for well-being is to identify something you want to improve, focus on it relentlessly, and compare yourself to your previous self, not to other people who have what you want. Even imperceptibly small daily steps compound over time to make a big difference.

But while that gives you a way to improve, it’s less clear what you should focus on.

Envy is as Envy does

Envy and jealousy aren’t considered good things for a good reason. They feel bad mentally and physically. They gnaw at you, and make you doubt yourself. They’re also often considered immoral; something we should be ashamed of feeling and thus hide. But they’re also normal emotions most have experienced. Sometimes they’re unproductive and should be restrained, but they may actually be quite useful in pointing us toward what we should focus on.

Although we still don’t quite understand what emotions are, they’re not completely mysterious either. We’ve come to understand that our experience of anger, happiness, or envy is the result of unconscious, biochemical reactions in our brains that, while similar, are not the same for all of us. Studies have also found that emotions highlight what’s important for us to pay attention to; we remember things better and for longer when there is an emotion involved. Our personal blend of culture, experiences, and genetics means different phenomena trigger different emotional reactions in each of us.

For example, for many reasons I’m not a bit envious of Beyoncé’s success, but if I were a musician, had known her in school, or wanted to be famous, I might be. More likely my jealousy is aroused by a good idea someone had, an academic paper published on a topic I’m interested in, or a previous colleague of mine getting a promotion or launching a successful product. We don’t tend to be jealous indiscriminately, but when there is some closeness or familiarity, some perceived shared trait or path that could have been followed.

The problem is it’s often hard to pinpoint the target of an emotion like envy; we see something, we feel something, but it takes time and introspection to figure out what the root cause really is. It may start from the visible perception of another’s achievements, but underneath there are traits, skills, or circumstances that are worthy of admiration and contributed to that achievement. When you recognize those traits you can distill them into goals, and then they serve as a direction for you to strive toward.

“When envy is inevitable, it should be used as stimulus.” – Bertrand Russell

Studies have also found that “benign envy can lead to risk taking and self-improving behavior”. The point isn’t to feel envy and have it build toward resentment. We likely won’t get exactly what we want, and it’s possible our feelings are misguided, but as Nietzsche wrote, by acknowledging the target of our envy as an aspiration and embracing it, we understand where we can grow to be more fulfilled. Along with other emotions, envy helps humans direct our efforts, but machines operate quite differently.

Emotional Algorithms

Emotions are often positioned as counter to intelligence. While intelligence has many definitions, it is often defined as the ability to learn, integrate information, and apply it to solve problems with self-awareness and conscious reasoning. While machines aren’t self-aware or conscious, we do build AI systems to solve problems that would require those traits in humans.

On the other hand, emotions have long been thought of as interfering with our reasoning. Many behavioral and economic theories of recent centuries were built on the notion of a rational actor making reasoned, rational choices. Behavioral research, however, has shown this to be false: human decisions are usually based on biased heuristics and emotional reactions, not rational thought.

However, this dichotomy of emotions as separate from intelligence is itself misleading. While emotions may be unconscious and perceived as irrational, the existence of similar limbic systems, which give rise to emotions, in the brains of much more primitive animals suggests emotions are also likely a type of intelligence developed for our survival. Emotions and feelings may be biochemical algorithms suited to help us navigate the world by performing quick pattern recognition and instigating behavioral responses without taxing our limited conscious cognitive abilities.

Let’s consider emotions to be a set of pre-tuned algorithms that have “learned” necessary basic behaviors over millennia and passed that knowledge to you in your genetic code. Your instincts may malfunction, and you may be afraid of things you shouldn’t be, or attracted to what is harmful: we all have flaws. But that doesn’t mean you should discount them. On the contrary, by paying attention to them you can fine-tune those instincts through your active, conscious efforts, and even override them.

Our quick, emotional response, described by Kahneman as system 1, can be loosely compared to how we build AI systems using machine learning: from the bottom up, using data to fine-tune a model for specific tasks through experiential learning. The conscious, rational ability, described as system 2, can analogously be compared to rule- or logic-based AI systems built top-down, directly encoding knowledge and directions.
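To make the analogy a bit more concrete, here is a toy sketch in Python. The spam-filter task, the tiny training set, and all function names are hypothetical, chosen only to keep the example self-contained: the rule-based function encodes knowledge top-down, while the perceptron learns its weights bottom-up from labeled examples.

```python
# Top-down ("system 2"-like): the knowledge is written down as explicit rules.
def rule_based_is_spam(text: str) -> bool:
    return any(word in text.lower() for word in ("free", "winner", "prize"))

# Bottom-up ("system 1"-like): a perceptron learns which words matter from
# labeled examples, with no rules supplied in advance.
def train_perceptron(examples, epochs=20):
    weights = {}              # word -> learned weight
    bias = 0.0
    for _ in range(epochs):
        for text, label in examples:       # label: 1 = spam, 0 = not spam
            words = set(text.lower().split())
            score = bias + sum(weights.get(w, 0.0) for w in words)
            pred = 1 if score > 0 else 0
            if pred != label:              # classic perceptron update
                delta = label - pred
                bias += delta
                for w in words:
                    weights[w] = weights.get(w, 0.0) + delta
    return weights, bias

def learned_is_spam(text, weights, bias):
    words = set(text.lower().split())
    return bias + sum(weights.get(w, 0.0) for w in words) > 0

if __name__ == "__main__":
    training_data = [
        ("claim your free prize now", 1),
        ("you are a winner click here", 1),
        ("meeting moved to thursday", 0),
        ("draft of the report attached", 0),
    ]
    weights, bias = train_perceptron(training_data)
    msg = "free tickets for the winner"
    print(rule_based_is_spam(msg), learned_is_spam(msg, weights, bias))
```

Both approaches flag the test message here, but they arrive there differently: one because a human wrote the rule, the other because the data pushed its weights that way.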

In order to build a system with machine learning, you need to specify how the system will measure its success as it’s learning. This determines what it considers valuable, and conversely, what it’s penalized for not performing well on. The algorithm will dutifully make calculations and optimize toward any predetermined value function you impose. Human emotions like envy provide us with a recognition that something of value is missing: our physical or emotional suffering can be interpreted as a sign we’re being penalized for its absence, and if we take steps to optimize our lives toward that missing value we may be better off. An emotional reaction provides initial motivation for doing something different than what we already are. The approach to machine learning shares some characteristics with system 1, but taking a step back to the broader view of how the two systems work together, there is no analogous recognition from the algorithm that something is missing to serve as motivation for changing the value function. That seems to be one crucial missing piece for intelligence.
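A minimal sketch of that point, with hypothetical data and names: the toy loop below fits a single constant to some numbers and dutifully follows whichever penalty it is handed. Swapping the loss changes what answer it “values”, and nothing in the loop ever questions the objective itself.

```python
# The learner optimizes whatever value function (here, a loss) it is given.
data = [1.0, 2.0, 2.0, 3.0, 100.0]    # one extreme value included on purpose

def squared_error_grad(pred, y):      # penalizes large misses heavily
    return 2.0 * (pred - y)

def absolute_error_grad(pred, y):     # penalizes all misses proportionally
    return 1.0 if pred > y else -1.0

def fit_constant(grad_fn, steps=5000, lr=0.01):
    pred = 0.0
    for _ in range(steps):
        # average gradient of the chosen loss over the data
        g = sum(grad_fn(pred, y) for y in data) / len(data)
        pred -= lr * g
    return pred

if __name__ == "__main__":
    # The same optimization loop "values" different answers depending only on
    # the loss imposed: squared error pulls toward the mean (~21.6),
    # absolute error toward the median (2.0).
    print(round(fit_constant(squared_error_grad), 1))
    print(round(fit_constant(absolute_error_grad), 1))
```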

An Emotional Turing Test

The Turing Test and its variants, where a machine tries to imitate a human in conversation, are considered one major benchmark for measuring intelligent behavior. Usually the question is posed as “can machines think?” But at our core humans are emotional animals. It’s much more natural for us to have irrational emotional responses, and only afterward layer on rational thinking. Maybe a more complete measure of intelligence would be based on the ability both to leverage your emotions to direct conscious thinking and learning, and to ignore and override them when necessary.

AI systems can be trained to identify human emotions, with applications like sentiment analysis or facial recognition, but that doesn’t translate to experiencing those emotions, just recognizing them. I’m not suggesting we should replicate the experience of human emotions in a machine, but would something that serves a similar role be useful? If so, what would it be?

What if an algorithm could recognize when the value function it’s trying to optimize wasn’t actually helping it learn what it’s trying to learn, or do what it’s trying to do, and it updated the value function on its own to perform better? Not simply choosing from a set of predetermined value functions, but really adapting what it’s focusing on and how it’s learning based on the data it’s encountering. That would likely require a system of algorithms working together, with some monitoring what others are doing and adjusting the entire system accordingly, which gets us much closer to human modes of collaboration between system 1 and system 2. I wouldn’t call that system conscious, but if a future Alexa got “jealous” of a future Siri for how many languages it speaks and added another language by itself, it’s getting pretty close.
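A genuinely self-directed version of this is well beyond a blog-post sketch, but the rough toy below gestures at the structure. Everything in it is a hypothetical stand-in: the held-out “goal” data, the blended loss, and the crude monitor rule. One loop dutifully optimizes whatever value function it is handed, while a second loop watches a higher-level goal and shifts the value function when it stops helping.

```python
# A rough toy sketch: it only reweights two predetermined losses, so it is far
# weaker than the self-directed adaptation imagined above, but it shows the
# monitoring-and-adjusting shape.
data = [1.0, 2.0, 2.0, 3.0, 100.0]   # training data with an extreme outlier
held_out = [1.5, 2.0, 2.5]           # what the system "really" cares about

def blended_grad(pred, y, alpha):
    """Gradient of alpha * squared error + (1 - alpha) * absolute error."""
    squared = 2.0 * (pred - y)
    absolute = 1.0 if pred > y else -1.0
    return alpha * squared + (1.0 - alpha) * absolute

def inner_learn(alpha, pred, steps=500, lr=0.01):
    """The 'system 1' learner: optimizes whatever loss it is handed."""
    for _ in range(steps):
        grad = sum(blended_grad(pred, y, alpha) for y in data) / len(data)
        pred -= lr * grad
    return pred

def goal_error(pred):
    """The higher-level goal the monitor watches: average held-out error."""
    return sum(abs(pred - y) for y in held_out) / len(held_out)

if __name__ == "__main__":
    alpha, pred, best = 1.0, 0.0, float("inf")  # start trusting squared error
    for phase in range(20):
        pred = inner_learn(alpha, pred)
        score = goal_error(pred)
        # The 'monitor': if the current value function has stopped meaningfully
        # improving the actual goal, shift the value function itself.
        if best - score < 0.05:
            alpha = max(0.0, alpha - 0.2)
        best = min(best, score)
        print(f"phase {phase:2d}  alpha={alpha:.1f}  pred={pred:6.2f}  goal_err={score:.2f}")
```

Even that crude loop only tweaks objectives it was handed; genuinely noticing on its own that something valuable is missing, the way envy does for us, remains the harder and more interesting problem.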
