Andy's Math/CS page

Thursday, November 08, 2007

Limits of Transcendence: Our Failure to Protect Prisoners

One thing I'd long been planning to post about is just how positive a role math has played in my own life. Since I came to enjoy it (relatively late--around senior year in high school), I have become: less restless; less materialistic; clearer and more self-reliant in my thinking; able to think longer and more fruitfully about all kinds of things, with nothing but a pen and paper.

All of these beneficial effects constitute, to my mind, a 'practical application' for math that warrants mentioning in the same breath as its uses in science and technology. In particular, I always thought these virtues seemed especially well-suited to empower and improve the constrained lives of prisoners. I know of one example--Paul Turán seems to have invented extremal graph theory in a Hungarian labor camp during WWII--but Turán was admittedly already a professional mathematician. Of course math isn't for everyone, but I would be very interested to know whether it's been taught specifically for enjoyment in prisons, and with what results.

It's an exciting idea to me, and I hope it excites others. However, it'd be irresponsible and dishonest to pursue it without acknowledging math's limits as a life practice. Coping with boredom is one thing, but, speaking personally, my mathematical thinking goes to pieces when I'm in acute distress or suffering. There's a limit to what we should reasonably expect to transcend through math, just as with art, religion, meditation, and other such 'soft' techniques... which brings me to what actually prompted this post: news that makes me ashamed of the state of affairs in California, my home state.


According to this article (see also this Times editorial), Gov. Schwarzenegger recently vetoed a bill that would allow nonprofits to distribute condoms in prisons. (He claimed it would conflict with the law against sexual contact between inmates.) It seems he did something similar two years ago. There are some condom distributions in CA prisons, and Schwarzenegger authorized one new, small-scale pilot program, but he derailed the state legislature's decision to make such distributions a statewide reality.

I hope the cruelty and stupidity of this decision speak for themselves, but I'll make a few comments to encourage further thought, because a decision this bad needs to be understood as well as denounced. To try to make (partial) sense of it, let me suggest that, in policymaking and debate around HIV/AIDS and other sexually transmitted infections, there are various gradations of the position that would block direct public health initiatives.

First, there are those who simply disapprove of a particular behavior, be it pre- or extramarital sex, same-gender sex, sex in prisons, or intravenous drug use. That disapproval doesn't necessarily lead them to disregard other social concerns or to adopt a blanket policy position.

Then there are those who, through a combination of wishful thinking and manipulation, arrive at the idea that the health threat posed by risky behaviors might itself be the best deterrent against those behaviors, superior from a moral and public health perspective to interventions that would make those behaviors safer (condom distributions, needle exchanges, HPV vaccinations, etc.). They accept arguments to this effect without sufficient critical thought, and neglect contrary evidence.

Then there are those who publicly espouse this best-of-both-worlds notion, but whose prolonged refusal to engage with the evidence leads one to suspect that either (a) they no longer feel compassion for victims of infection from their designated 'deviant' population groups, or (b) they feel that infection's (supposed) role in discrediting and deterring 'deviant' behavior outweighs its human costs (so, in a sense, they ally themselves with the epidemic).

Finally, at the far end you have those who feel only contempt for 'deviants', and openly exult in public health crises which they see as just punishment.

But can we explain Schwarzenegger's behavior as occupying a point along this spectrum? I'm not so sure. As soon as incarceration comes into play, the debate seems to get much more complex (i.e., crazier). Are these notions of morality and deterrence even operative in the (supposedly kinda-socially-liberal) Governor's decision? Is he offering the prison industry feeble cover against the already-widespread recognition that, in many cases, it fails to protect its inmates from sexual coercion? (For a discussion of the state of available statistics on prison rape, see this Human Rights Watch page.) Or is he merely bowing to the time-honored tradition of abusing prisoners?

I would love to hear what others think, because I can't decide the issue myself. All that's clear to me is that he's taken citizens in a position of enforced vulnerability, who are contracting HIV at eight times the rate outside prison, and cut off one of the most important means of protecting them. (I don't believe the state was even asked to pay for the condoms...!)

Meanwhile--Governor Schwarzenegger, withdraw your veto!

Tuesday, November 06, 2007

Goofball DNA-Bonanza

Since this blog's readership is, I'm guessing, essentially a (very-)proper subset of the readership of Shtetl-Optimized, you may be familiar with Scott Aaronson's recent musings and observations on DNA rewriting rules (which I encourage readers to look at before reading further here). I've been finding this stuff a fun alternative to my usual research (the last time I got this worked up over DNA might've been when I heard about ANDi the monkey), and have come up with three simple results in the past couple of days. The latest is a bit too long for Scott's comments section, so I wrote it up in LaTeX and am putting it here.

I encourage all readers to generate, pose, and/or answer additional questions on related themes! Collaborative online research is lots of fun in my experience, and something I'm considering trying to do more of with this blog.


Let me explain the third result after reviewing the other two. But before I begin, I want to emphasize that I make no claims as to the biological realism of any of the models under consideration, though I'm happy to try to work out results in more realistic models if they still seem like interesting puzzles.

First, I showed that, in synthesizing a string x via Scott's model of DNA rewrite rules, the substring-reversal rule can be eliminated at the cost of a constant-factor increase in both the number of steps and the size of the intermediate strings.

Then, I showed that a factor-2 increase in steps allows one to hold the intermediate string to within a factor 2 of the final string length at all times. This result, by contrast, holds whether one is trying to synthesize x de novo or to transform some other string x' into x (although I only wrote up the first case, the other is essentially the same analysis). In the process, I was especially pleased to introduce an analytical tool called 'garbage monsters'; Scott and I hope the terminology will catch on in mainstream biology.

The newest result is the following: suppose you want to implement an error-correcting code x --> C(x) via DNA rewrite rules, where the choices of operations made can depend on x; then, unfortunately, on the order of n/(lg n) operations are needed in the encoding process to ensure that the code is actually robust. A matching upper bound can easily be extracted from Scott's observations in his comments section, and it's a dispiriting bound, because for that price you could synthesize C(x) from scratch, without even manipulating the string x you're given! (A similar, simpler result holds for the decoding process, but I haven't written this up.)
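
For intuition about where that n/(lg n) threshold comes from, here is the standard counting heuristic for the cost of synthesis, written out in LaTeX (this is my gloss on what's implicit in Scott's comments, not the robustness argument from my write-up):

```latex
% A gloss on the n/lg n synthesis benchmark -- a standard counting
% heuristic, not the robustness lower bound itself.
By the second result, we may restrict attention to productions whose
intermediate strings have length $O(n)$ (at only a factor-2 cost in the
number of steps), so each operation is described by its type plus $O(1)$
endpoints, i.e.\ $O(\lg n)$ bits. Hence
\[
  \#\{\text{strings reachable in } k \text{ steps}\}
    \;\le\; 2^{O(k \lg n)} \;=\; n^{O(k)},
\]
and synthesizing every string in $\Sigma^n$ forces
\[
  n^{O(k)} \;\ge\; |\Sigma|^n
  \quad\Longrightarrow\quad
  k \;=\; \Omega(n/\lg n),
\]
while Lempel-Ziv-style copying gives a matching $O(n/\lg n)$ upper bound.
```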

For completeness, I'll reproduce my sketch proofs for the first two results below... but first let me note that monkeys continue to make news (thanks again, Chaya!).


***First Result, Comment 21 on Scott's post***

"So how powerful is the reversal operation, really?

Suppose we can produce string x in k applications of operations (1)-(4). I claim we can do it in 4k + 1 operations, without reversals. (Of course, this will show that the Lempel-Ziv approach is also a constant-factor approximation in our setting.)

First, note that, if you can produce x in k steps with (1)-(4), you can also produce x^R, i.e. the reversal of x, in k steps with (1)-(4), by ‘flipping’ each operation.

Second, note that you can produce x.x^R, their concatenation, in 2k steps by running both productions in parallel, interleaving steps on the two halves. If the string after the i’th step in the original production of x was s, then the string after the 2i’th step in the new production will be s.s^R.

Now, in this ‘joint’ production we eliminate reversals as follows: say in the production of x, at the i’th stage we wish to move from

s = a.b.c to s’ = a.b^R.c …

well, in the joint production, at stage 2i we’ve got the string

a.b.c.c^R.b^R.a^R

By two copy and two deletion operations, we reach

a.b^R.c.c^R.b.a^R,

as needed to advance both sides’ progress.

Finally, when the joint production’s finished, delete the right-hand side to yield x. That’s 4k + 1 steps."
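
To make the index-chasing concrete, here is a toy Python sketch of that stage-2i maneuver (the function names and indexing conventions are mine; Scott's post doesn't fix an API):

```python
# Toy sketch of the reversal-elimination step; illustration only.

def copy_sub(s, i, j, k):
    """Insert a copy of s[i:j] at position k."""
    return s[:k] + s[i:j] + s[k:]

def delete_sub(s, i, j):
    """Delete s[i:j]."""
    return s[:i] + s[j:]

def rev(s):
    return s[::-1]

# The 'joint' string a.b.c.c^R.b^R.a^R; we want to reverse b on the
# left half (and, symmetrically, b^R on the right half).
a, b, c = "GAT", "TACA", "CG"
s = a + b + c + rev(c) + rev(b) + rev(a)
la, lb, lc = len(a), len(b), len(c)

# 1) copy the right half's b^R to just after a:
s = copy_sub(s, la + lb + 2 * lc, la + 2 * lb + 2 * lc, la)
#    now a.b^R.b.c.c^R.b^R.a^R
# 2) copy the left half's b to just before a^R:
s = copy_sub(s, la + lb, la + 2 * lb, len(s) - la)
#    now a.b^R.b.c.c^R.b^R.b.a^R
# 3) delete the left half's original b:
s = delete_sub(s, la + lb, la + 2 * lb)
#    now a.b^R.c.c^R.b^R.b.a^R
# 4) delete the right half's original b^R:
s = delete_sub(s, la + lb + 2 * lc, la + 2 * lb + 2 * lc)
#    now a.b^R.c.c^R.b.a^R, as claimed

assert s == a + rev(b) + c + rev(c) + b + rev(a)
print(s)  # GATACATCGGCTACATAG
```

Each original reversal costs these four operations in the joint production (plain steps cost two apiece, one per half), which is where the 4k + 1 total comes from.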


***Second Result, Comments 30-31 on Scott's post***

"I believe I can show that, if we’re producing a string x of length n, we can keep the intermediate strings to length at most 2n, while incurring at most a factor-2 blowup in the number of steps used.

As Scott suggested to me in conversation, let’s keep track of which symbols in the intermediate workstring will and won’t make it into the final string x without deletion. If they won’t, call them ‘nonterminals’. (If a symbol will be replicated, and at least one of its descendants will be included in the final string, it’s not considered a nonterminal.)

First note that, if in a production of x there appears at some stage a block of consecutive nonterminals, we can modify the production so that they are treated as a ‘unit’ in subsequent steps, i.e. no string operation takes an endpoint that’s properly inside the block. They get shuffled around together and eventually destroyed together. The production’s length is not affected.

In a stage of such a production, for analysis purposes, we call a maximal block of nonterminals a ‘garbage monster’. Each time a new garbage monster is created (and many may be created by a single copy operation–even if there were no nonterminals in the original substring), we give it an unused, goofy monster name.

When string manipulations cause two garbage monsters to become adjacent, we combine them under one of their two names (and say that that one ‘eats’ the other one).

Key claim: in our modified type of production, at most one garbage monster gets eaten or destroyed at every step! Draw some pictures to see if you agree.

Now, with this fact in hand, we can modify the production again and argue that it gives what we want. Here’s the strategy: given a production of x of the type we described, modify it so that after each step we pause and delete all currently-living garbage monsters, one deletion per monster.

This is a bit wasteful, since some of these might’ve been eaten by other garbage monsters instead; but every garbage monster must die somehow, and by the key claim at most one monster dies per step, so a production with k steps creates at most k monsters in total. Giving each its own execution therefore adds at most k new steps.

Finally, observe that the string is culled to length at most n after every new round of ‘executions’ (once the nonterminals are gone, every surviving symbol has at least one descendant in the final string, and distinct symbols have disjoint descendants), and since the workstring’s length is at most doubled by any string operation, its length is bounded by 2n throughout."
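
Since the nonterminal bookkeeping is the crux here, let me also give a small Python sketch of how one might tag symbols and spot garbage monsters after the fact (the data structures are mine; only the rewrite model is from Scott's post):

```python
# Tag each symbol with a unique id and record copy lineage; afterwards,
# a symbol is a nonterminal iff neither it nor any descendant reaches
# the final string. Maximal runs of nonterminals are garbage monsters.

from itertools import count

fresh = count()
children = {}  # id -> ids created from it by copy operations

def seed(text):
    """Start a production with fresh symbols."""
    return [(ch, next(fresh)) for ch in text]

def copy_sub(work, i, j, k):
    """Insert a copy of work[i:j] at position k, recording lineage."""
    clone = []
    for ch, sid in work[i:j]:
        nid = next(fresh)
        children.setdefault(sid, []).append(nid)
        clone.append((ch, nid))
    return work[:k] + clone + work[k:]

def delete_sub(work, i, j):
    return work[:i] + work[j:]

def reverse_sub(work, i, j):
    return work[:i] + work[i:j][::-1] + work[j:]

def survives(sid, final_ids):
    """True iff sid or one of its descendants is in the final string."""
    stack = [sid]
    while stack:
        s = stack.pop()
        if s in final_ids:
            return True
        stack.extend(children.get(s, []))
    return False

# A toy production of "GATTAG" with one wasted symbol (the C).
stages = [seed("GATC")]
stages.append(copy_sub(stages[-1], 0, 3, 4))   # GATCGAT
stages.append(reverse_sub(stages[-1], 4, 7))   # GATCTAG
stages.append(delete_sub(stages[-1], 3, 4))    # GATTAG

final_ids = {sid for _, sid in stages[-1]}
for stage in stages:
    # lowercase letters mark the nonterminals at each stage
    print("".join(ch if survives(sid, final_ids) else ch.lower()
                  for ch, sid in stage))
# prints: GATc / GATcGAT / GATcTAG / GATTAG
```

In the printout, the maximal lowercase runs are exactly the garbage monsters; the prompt-execution strategy would give the lone c its own deletion step rather than letting it linger until the end.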