“Why?” is one of the first questions we humans ask. Developing ethical justifications to support decisions is an essential part of my participation in the Community Ethics Committee (CEC): the details of “why” influence life-and-death decisions made by patients, caregivers, families, and health proxies. Facts are cold. I believe that real-life decisions ultimately depend on love.
The CEC works hard to establish a rational basis for its conclusions and positions. But I’m realizing that our emotions usually decide. I’d like to think that love wins, but lots of other emotions have to be considered too.
When our daughter was making college decisions, I was the rational engineer who got out the whiteboard, made lists of schools with reasons pro and con, and mapped out a strategy in a perfectly reasoned way. But neither my daughter nor her mother looked at the reasoned charts, and I ultimately agreed that we would trust our daughter’s gut. What she felt was the right destination for the next four years of her life was what mattered. Rational decision-making had almost no influence.
The idea that our positions and decisions are based on feelings more than reason was affirmed again when I read about religion’s place in secular medicine: some basic moral philosophy and meta-justifications are all we have at the end of it. In any case, the authors quit before they get to the best part, the decision on life or death. I’m convinced now that it is our emotion, who and what we love, that counts.
What are stock markets, after all? Numbers going up and down, the sum total of investors’ emotions: greed, anxiety, and hope. The investment world is not so much fact and reality as an emotional index of how investors feel at that moment.
Consider the hard ethical decisions that will have to be made by the software governing self-driving cars. What will the program decide when the car is traveling at 70 miles per hour and a child, a mother, and a baby carriage dash into the road? It is going too fast for the brakes to help. Do nothing, and the child, the baby, and the mother will surely perish. Steer right, and the car crushes the elders waiting at the bus stop. There is a third option: the program could steer left, off the cliff, and commit car-i-cide, killing the passenger and destroying the car. What will your so-smart autonomous self-driving car decide?
It is hard enough for a human to make such decisions, never mind an emotionless robot. In that impossible scenario, the right decision for a person will most likely be governed by who or what that person loves most. A dearly loved family member will probably trump any stranger. And to love someone so much that you would do anything for them means the choice to drive off the cliff and sacrifice your own life is never entirely off the table.
After letting this stew for a while, and watching the current political spectacle, I also came to the conclusion that not only “Love Wins,” but “Hate Wins” as well. Our basest animal, bigoted, racist, xenophobic selves will use our love or our hate to drive our decisions. Sadly, some decisions will be forced on people by those who think they know better how others ought to act. Will our better selves come out and do the kind and compassionate thing?
Reason
alone is far from sufficient. When my health care proxy decides whether I live
or die, I hope it will be done with love and compassion.
Shukong Ou has been a member of the CEC since 2011.