News, and some Number Theory
Time for a personal update: I'm enjoying myself greatly here in Cambridge, and was recently admitted as a transfer student to MIT's EECS department. The move has both personal and academic advantages for me, but let me emphasize that I think UCSD has a lot to offer prospective students, and in addition is home to many cool people whom I miss... I would be happy to offer advice about either school.
When I started writing here as an undergrad, I knew very few people in the worlds of TCS and Complexity, and the blog was a way of reaching out to a community I wanted to join. Now, thanks in part to blogging (through which I met Scott, my advisor), I'm immersed in that community at a school with a large, vibrant Theory group. This is to say that, though writing here still interests me, it no longer answers an urgent personal need, and I am most likely to post new material in response to requests for specific topics or post types.
Today I was perusing a favorite book of mine, 'Gems of Theoretical Computer Science' by U. Schöning & R. Pruim. One chapter partially treats the undecidability of determining whether systems of Diophantine equations (polynomial equations where solutions are required to be integral) have a solution. This was Hilbert's tenth problem, posed in 1900, resolved (negatively) in 1970, and full of deep math.
The authors pose the following exercise: show that the version where variables must take nonnegative values is reducible to the integer version. (And vice versa; the rational-values version also reduces to the integral version, though the converse appears to be open...)
Think about it...
The reduction they give: given a set of polynomial equations {P_i(X_1, ..., X_k)}, we want to determine whether they're simultaneously satisfiable with nonnegative values. Introduce, for each j <= k, the variables Y_{j, 1}, Y_{j, 2}, Y_{j, 3}, Y_{j, 4}.
Now add to the system, for each j <= k, the constraint that
X_j = Y_{j, 1}^2 + Y_{j, 2}^2 + Y_{j, 3}^2 + Y_{j, 4}^2.
Claim: the new system is satisfiable over the integers iff the original one is satisfiable over the nonnegative integers! The proof, of course, is an easy consequence of Lagrange's four-square theorem: every nonnegative integer is the sum of four integer squares.
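(Not from the book, just a quick illustration: Lagrange's theorem is easy to check empirically for small numbers. Here's a brute-force search, in Python, for a four-square representation of a given nonnegative integer.)

```python
from itertools import product

def four_squares(n):
    """Brute-force search for (a, b, c, d) with a^2 + b^2 + c^2 + d^2 = n."""
    bound = int(n ** 0.5) + 1  # each square is at most n
    for a, b, c, d in product(range(bound), repeat=4):
        if a * a + b * b + c * c + d * d == n:
            return (a, b, c, d)
    return None  # never reached, by Lagrange's theorem

# Every nonnegative integer up to 100 has a representation:
assert all(four_squares(n) is not None for n in range(101))
```

Of course, the content of Lagrange's theorem is that the search never fails, no matter how large n is; the check above is just a sanity test, not a proof.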
So, my question is: could a reduction be found that doesn't rely on Lagrange's theorem, or on one of its weaker variants in which the constant 4 is replaced by some constant c > 4? Or maybe, for some constant c, the proof really is so simple that I'd be satisfied we're performing this reduction with the cheapest possible tools.
If our plan in the reduction is to constrain the original variables in a form analogous to the above, namely
X_j = Q_j(Y, Z, ...),
where Y, Z, ... are new integral variables, is there any way around proving a version of Lagrange's Theorem? Generally we show that polynomials are identically nonnegative by expressing them as a sum of one or more squares, e.g.,
Y^2 + Z^2 - 2YZ = (Y - Z)^2.
Using this template, we'd be reliant on Lagrange. However, in his thesis, Minkowski conjectured that there exist nonnegative real polynomials *not* so expressible, and this was proved by Hilbert. A simple example I found online is
Q(X, Y, Z) = X^4*Y^2 + Y^4*Z^2 + Z^4*X^2 - 3*X^2*Y^2*Z^2,
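(An aside, not needed for the argument: taking Q in the cyclic form X^4*Y^2 + Y^4*Z^2 + Z^4*X^2 - 3*X^2*Y^2*Z^2, its nonnegativity follows from AM-GM applied to the three leading monomials, whose product is (X^2*Y^2*Z^2)^3. A quick numerical spot-check in Python:)

```python
def Q(x, y, z):
    # Cyclic form: nonnegative everywhere by AM-GM, yet (by Hilbert)
    # not expressible as a sum of squares of polynomials.
    return x**4 * y**2 + y**4 * z**2 + z**4 * x**2 - 3 * x**2 * y**2 * z**2

# Spot-check nonnegativity on an integer grid (AM-GM guarantees it everywhere):
vals = [Q(x, y, z) for x in range(-3, 4)
                   for y in range(-3, 4)
                   for z in range(-3, 4)]
assert min(vals) >= 0
assert Q(1, 1, 1) == 0  # the value 0 is attained at a nonzero point
```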
and another counterexample is derived systematically in the classy and useful inequalities book 'The Cauchy-Schwarz Master Class' by J.M. Steele. (Artin proved, however, that every polynomial that's identically nonnegative is expressible as a sum of squares of finitely many *rational* functions.)
Could it be that one of these creatures is easier to use in the reduction (both to prove that it's nonnegative and that it ranges over all nonnegative values)? Somehow I doubt it. Anyways, I just wanted to point to an instance where a reduction one would expect to be straightforward seems to require fairly nontrivial understanding of the problem domain.
Labels: computability, general math
7 Comments:
Best wishes for your research at MIT!
aravind srinivasan
By Anonymous, at 10:21 AM
Thanks, Aravind!
By Anonymous, at 10:54 AM
You write above "also the rational-values version is algorithmically equivalent." Clearly the rational values version reduces to the integral one. However, I seem to recall that the reverse reduction is an open problem. You might want to check your reference.
David Speyer
By Anonymous, at 11:10 PM
woops... you're right, of course! Anyone know if there's a consensus or conjecture that supports one way or the other?
I'll alter the post accordingly
By Anonymous, at 11:28 PM
Regarding the question about whether the rational case is undecidable, the person I know who has thought hardest about it is Bjorn Poonen. According to his research statement, he "suspects that this ... is unsolvable in general and ... has been proving theorems heading in this direction."
To my understanding, the plausible arguments on each side are the following. Note that this post just about exhausts my knowledge.
For undecidability: the first-order theory of the rationals is undecidable, as is the Diophantine theory of many rings similar to Q. Also, no one has any clue what an algorithm would look like. There was a conjectural algorithm, involving something called Brauer-Manin obstructions, but a recent paper of Poonen's shows that this algorithm is not always correct.
Against undecidability: the obvious way to prove undecidability is to find a Diophantine way to describe Z within Q. More generally, one could hope to build some other Diophantine subset of Q which is in bijection with Z, for which the operations of addition and multiplication on Z correspond to Diophantine relations. In every case where an analogue of Hilbert's tenth problem has been resolved (to my knowledge), it has been by one of these strategies. However, such a construction would break a conjecture of Mazur's. (This conjecture states that, if we take the topological closure of the rational points in an algebraic variety, there are only finitely many connected components.)
I am unclear as to how strong the evidence for Mazur's conjecture is.
By Anonymous, at 5:27 PM
I just remembered that Poonen wrote a very readable survey on this subject for the Notices of the AMS.
http://math.berkeley.edu/~poonen/papers/insufficiency.pdf
By Anonymous, at 5:45 PM
Thanks, David! I don't think I'm qualified to understand the paper, but I'm glad someone's working on the problem.
We can at least show the problem (of determining rational solvability of a given polynomial) is NP-hard (easy exercise).
So, could we show a higher complexity bound?
A more out-there possibility: could it be that the problem is undecidable in general, but decidable for polynomials in k variables, for each fixed value k?
('fixed parameter tractability' for computability theory--hmm...)
By Anonymous, at 4:24 PM