Computers map our driving routes, buy our groceries and have even been known to write poetry. But as society increasingly depends on technology in more contested areas, ethical and legal concerns also grow. Cris Moore, a professor at the Santa Fe Institute whose work focuses on computer science, math and physics, will deliver two lectures on "The Limits of Computers in Science and Society" as part of SFI's Stanislaw Ulam Memorial Lecture Series. Moore served two terms as a city councilor for District 2 starting in 1994. This interview was edited and condensed for style and length.

SFR: The first lecture discusses why some problems are easy for computers to solve, and others are hard or impossible. What are some examples?

CM: So when you ask Google Maps to find you the shortest path from A to B, there are a huge number of possible paths, but it's pretty easy for an algorithm to zero in on the shortest one. But there are other problems, like designing an airline schedule, or the classic traveling salesman problem, where finding the best solution is like looking for a needle in a haystack. Different problems have different structures: Some have the kind of structure that lets us zoom in quickly on the solution; some we can't get a handle on. … There are questions deep inside logic and mathematics. For instance, Alan Turing, who was famous for breaking the Nazis' Enigma code in England, proved there's no program that can predict what other programs will do, because you could ask it about itself, throw a twist in there and create a paradox.
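The twist Moore describes can be sketched in a few lines of Python. Assume, hypothetically, that some function `halts` could predict whether a program halts; a program that asks `halts` about itself and then does the opposite refutes whatever answer the predictor gives. The names here are illustrative, and the infinite loop is never actually run:

```python
# A sketch of Turing's diagonal argument. `halts` is a hypothetical
# halting-predictor; the paradox program does the opposite of its prediction.
def make_paradox(halts):
    def paradox():
        if halts(paradox):
            while True:   # predictor said "halts," so loop forever
                pass
        # predictor said "loops forever," so halt immediately
    return paradox

def is_wrong_about_paradox(prediction):
    """Show that a predictor answering `prediction` is refuted."""
    halts = lambda program: prediction
    paradox = make_paradox(halts)
    truly_halts = not prediction  # by construction, paradox does the opposite
    return halts(paradox) != truly_halts

# Whatever the predictor answers, it is wrong about its own paradox program:
assert is_wrong_about_paradox(True)
assert is_wrong_about_paradox(False)
```

Since both possible answers lead to a contradiction, no such `halts` function can exist.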

That's the halting problem?

Exactly. And this is close to [Kurt] Gödel's incompleteness theorems, which say there are things that are true but that we cannot prove—and I don't mean things like sunsets are beautiful. Things even in math. … So, the two lectures are very separate, and you don't have to come to them both. If you like the beauty of mathematics, come to the first. If you're concerned about the rise of computers in society and the impact on us as humans, come to the second. And if you like both of those things, come to both.

Was there a particular issue that prompted your second lecture on the rise of computers in society?

The second lecture focuses on the use of algorithms in the justice system, and especially the algorithm that recommends to a judge whether you should be released or detained if you're arrested. New Mexico, and to a greater extent California, and some other states … are trying to move away from the money bail system where … whether you stay in jail or not is a matter of how much money you have.

The justice system is historically extremely biased. I think everyone knows that African American and Hispanic people are much more likely to be arrested than white people and, when they're arrested, more likely to be charged with a crime, given longer jail sentences and on and on. Our justice system is terribly biased, as well, against low-income people. … So a lot of people who care about reforming this system are enthusiastic about having an algorithm, a computer program, which looks at your criminal record, for instance, and comes up with a kind of score and based on that score makes a recommendation to the judge about whether you should be released or not. The hope is these algorithms are more objective and perhaps less biased than human judges.

But there's a controversy over this. Two years ago, ProPublica heavily criticized one of the algorithms that's currently in use, COMPAS [Correctional Offender Management Profiling for Alternative Sanctions], and argued that it is, in fact, racially biased. There was pushback from some people, including independent analysts, saying it's not really true, and it turns out even figuring out what we mean by an algorithm being fair is a thorny issue.
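One way to see why fairness is thorny: when two groups have different underlying reoffense rates, a score can match precision and miss rates across groups and still produce different false positive rates, so reasonable fairness definitions can disagree about the same tool. The numbers below are illustrative assumptions, not drawn from the COMPAS data:

```python
# Confusion-matrix rates for a hypothetical risk tool in two groups.
def rates(tp, fp, fn, tn):
    precision = tp / (tp + fp)  # how often a "high risk" flag is right
    fpr = fp / (fp + tn)        # non-reoffenders wrongly flagged
    fnr = fn / (fn + tp)        # reoffenders missed
    return precision, fpr, fnr

# Group A: base rate 50% (50 of 100 reoffend)
pA, fprA, fnrA = rates(tp=40, fp=10, fn=10, tn=40)
# Group B: base rate 20% (20 of 100 reoffend)
pB, fprB, fnrB = rates(tp=16, fp=4, fn=4, tn=76)

assert pA == pB == 0.8       # flags are equally accurate in both groups
assert fnrA == fnrB == 0.2   # equal miss rates
assert fprA != fprB          # yet false positive rates differ (0.2 vs 0.05)
```

With unequal base rates, no score can satisfy all of these criteria at once, which is roughly the impasse the COMPAS debate ran into.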

This is the issue of algorithmic fairness?

Yes. Algorithms are just as biased as the data we give them. If we give them historically biased data, they are going to produce biased results. There's a danger that they're going to take the biases of society and put them inside this gleaming new box and hide them behind this sheen of objectivity and math, which nobody can argue with. … Algorithms might play a good role, but we need to apply a lot of critical thinking to them and demand a lot of openness and, at every stage, think about whether they're going to heal our biases or amplify them.

My understanding is that a piece of this issue is that COMPAS is proprietary.

That's another huge issue. … The COMPAS formula is secret, so it's not easy for independent analysts … [to] understand how it's working internally. Now, in a bunch of states, and in Bernalillo, there's another algorithm that's in use, which comes from the [Laura and John] Arnold Foundation … [which] is really sincere about wanting to reform the criminal justice system. Unlike COMPAS, the Arnold formula is public. It's a simple point system, and anybody can see how it works.
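A public point system of the kind Moore describes might look like the following sketch. The factors and weights here are illustrative assumptions, not the Arnold Foundation's actual formula; the point is that anyone can read the rules and recompute the score:

```python
# Hypothetical transparent point system: every factor and weight is visible.
def risk_score(record):
    score = 0
    if record["prior_convictions"] > 0:
        score += 1
    if record["prior_failure_to_appear"]:
        score += 2
    if record["pending_charge"]:
        score += 1
    return score  # higher score = recommend closer supervision

assert risk_score({"prior_convictions": 0,
                   "prior_failure_to_appear": False,
                   "pending_charge": False}) == 0
assert risk_score({"prior_convictions": 2,
                   "prior_failure_to_appear": True,
                   "pending_charge": False}) == 3
```

Because the formula is a short, published list of rules rather than a secret model, independent analysts and defendants can audit exactly how a recommendation was reached.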

Does your lecture have a call to action?

I talk a lot about the need for transparency. I also think the results need to be presented in a way that embraces the uncertainty in them. … It's important for judges and juries, defendants, parole boards, whoever is going to use these algorithms, to understand how much uncertainty they really have.

Does this interest in criminal justice and fairness indicate you might be running for public office again in your lifetime?

No. But I would love to work with people who want to figure out what New Mexico's policy should be on these algorithms.

Cris Moore: The Limits of Computers in Science and Society
7:30 pm Monday and Tuesday, Sept. 24 and 25. Free; registration recommended.
Lensic Performing Arts Center,
211 W San Francisco St.