Religion Today

Monday, April 20, 2015

Robot Morality

In research worthy of science fiction writer Isaac Asimov’s “I, Robot,” Bertram Malle is working to design a moral robot. Malle is the co-director of Brown University’s Humanity-Centered Robotics Initiative, and his approach is to create a robot that can learn moral behavior from the people around it. Ideally, you would surround the robot with morally good people, and the robot would learn ethical beliefs and behavior from them.
Like a child with its parents, the robot would then be taught morality and behavior by the people looking after it. Of course, there would be no need to limit the teachers to just two people. Once beyond the basics, robots could even crowd-source their ethical education. When two principles it has learned come into conflict, the robot could seek guidance and feedback from those it knows.
But what happens if the robot falls in with the wrong crowd? Perhaps the robot gets stolen by a criminal gang that teaches it how to be a thief or a murderer.
To avoid such a scenario, the robot should be equipped with a set of core rules that would guide its learning. Like Asimov’s “Three Laws of Robotics,” the guidelines would direct the robot away from doing harm and evil and toward doing good. The key question, then, is what are those rules?
Malle indicates these rules would need to include the prevention of harm to humans, like Asimov's First Law ("A robot may not injure a human being or, through inaction, allow a human being to come to harm"), as well as guidance concerning the politeness and respect required for smooth human interactions.
Another rule that would be needed is to treat all people the same, that is, according to the same ethical principles and behavior. As Malle puts it, “we can equip robots with an unwavering prosocial orientation. As a result, they will follow moral norms more consistently than humans do, because they don’t see them in conflict, like humans do, with their own selfish needs.”
The problem with human morality Malle identifies here is the selfishness of each individual. Selfishness often prevents humans from doing what they consider to be the morally correct act. Robots would not be diverted from moral behavior by selfishness because they lack a self. They have as much self-awareness as a TV or a refrigerator. They would be moral machines, always behaving ethically, without any personal needs or desires to sidetrack them.
But there is a second problem with human morality, namely, to whom should ethical behavior apply? Humans are always joining with other people in groups, and people often treat members of these groups differently from those who do not belong.
Family members treat each other differently from the way they treat non-family members. Friends behave differently toward each other than toward mere acquaintances.
We conduct our relations with members of our religious organization differently from those who do not belong, or more importantly, from those who disagree with our religion. The fracas in Indiana about religious freedom and discrimination against gays is a case in point.
Other groups affect our behavior toward others. During an election, we behave differently toward members of different political parties.
Some people treat members of certain racial or ethnic groups differently from members of their own. Just think about our current national argument over white police shooting black citizens, or the problems surrounding Hispanic immigration.
Once robots are programmed with the rule to treat all people the same, without regard to their group membership, these problems would be avoided. Since robots have no more self than a pickup truck, the human tendency to identify oneself with a group would not arise. Robots would have no reason to treat Hispanics or Asians differently from whites. They would not behave toward evangelical Christians with one set of moral standards, toward Catholics with another and toward Muslims with a third.
In other words, robots would be more moral than human beings. Their ability to perform in a morally consistent manner toward everyone they meet would be superior to our own.
Of course, research into robotics has not yet reached the ability to program robots in this way, but scientists like Malle are working toward that goal. It is sobering to think, however, that robots could outperform humans not only in raw calculating and thinking power, but also in terms of ethical behavior.
Note: This essay draws from "How to Raise a Moral Robot," by Bertram Malle, Live Science, April 2, 2015 (http://www.livescience.com/50349-how-to-raise-a-moral-robot.html).


Wednesday, April 01, 2015

Religious Freedom and Christianity


There may be a lot of smoke, but the fire is pretty small. The nationwide political tempest around Indiana's new Religious Freedom Restoration Act (RFRA) is much bigger than the law itself deserves. The bill's proponents touted it as a law to "protect" the religious rights of Christians (in particular) who do not want to support gay weddings because they do not believe in same-sex marriage. The bill's opponents argue that it is a license to discriminate.
Neither is correct. An RFRA law is a tool that allows the accused in a discrimination lawsuit to use religious belief as a defense. It does not require the judge or the jury to agree that the belief provides the defendant a compelling reason to discriminate. In the 20 years since the first RFRA was passed (in federal law), no court case has been successful in permitting anti-gay discrimination.
In other words, RFRA laws provide no automatic right to discriminate on a religious basis. All they do is provide the mechanism for a case that would determine whether a person’s belief rises to a threshold sufficient for such discrimination. Never have anti-gay views, in any form, risen to that threshold.
Of course, future court cases may be different. But it will be a long and arduous legal process, and the outcome may well be that RFRA laws do NOT permit religious discrimination against homosexuals by individuals or corporate entities.
Since the Indiana law gained national attention, a lot of ink (both physical and virtual) has been spilled discussing it. RFRA laws have become a political symbol in the ongoing national debate over same-sex marriage but, given their actual wording, they are a rather hollow symbol.
Why do Christians need protections from homosexuals? The president of the Family Research Council, Tony Perkins, gave one common answer: “The government shouldn’t force religious businesses and churches to participate in wedding ceremonies contrary to their owners’ beliefs.”
Are Christians so fragile that they are harmed by working at weddings they do not theologically agree with?
Jesus taught his followers to be tougher than that. Should they be oppressed through violence or compulsion, they should not rise up and resist. If hit on one cheek, they should offer the other. If forced to walk a mile, they should go a second (Matthew 5:38-42).
Jesus continued this set of ideas by concluding, “Give to him who begs from you.” How does this apply? When a same-sex couple asks a Christian photographer to photograph their wedding, he or she should say yes, and be glad they are willing to pay!
Jesus consistently taught his devotees to love their neighbors. When questioned about who was a neighbor, he told the parable of the Good Samaritan. A Samaritan, a class of people despised by Jews, stopped to help a Jew who was beaten when no one else would. He even paid for medical treatment (Luke 10:25-37). And Jesus’ moral for this story? This is what a person must do to “inherit eternal life.”
Just so that his followers would not mistake his point, Jesus even required them to love their enemies, saying that otherwise they were no better than tax collectors (Matthew 5:43-47).
When he describes his role in the Great Judgment, Jesus makes clear that he is on the side of the oppressed, of those who suffer discrimination. He identifies with the oppressed and says, "As you did it to the least of these, you did it to me" (Matthew 25:31-46).
So, what would Jesus do? It is clear, from his own words, that Jesus would not approve of Christians discriminating against anyone. Those who do are in danger of losing their access to eternal life. Instead, they should be helping those they disagree with, even when they object to the outcome.
In the end, the question facing American Christianity is what do Christians want to be known for? Is Christianity the religion of discrimination, of treating people as second-class citizens, or is it the religion of loving neighbors and enemies as Jesus taught?
