Joint computational complexity, and the "buy-one-get-one-free conjecture"
Below is a simple-to-state open question, stemming from this paper of mine from CCC'09. First, I'll state the question; then I'll give some background, explaining how it's an instance of a more general and significant problem.
The question
Let's consider the standard two-party model of communication complexity. Given inputs x and y to Alice and Bob respectively, suppose there are 3 functions the two parties are interested in evaluating on these inputs---let's call them F(x, y), G(x, y), H(x, y).
Question: is there a collection of total functions F, G, H, and a positive value T, such that:
(i) any one of F, G, H requires at least T bits of communication to compute;
(ii) any two of F, G, H can be computed in (1.01 T) bits of communication, on a common input (x, y);
(iii) but, computing all three of F, G, H on a common input requires at least (1.99 T) bits of communication.
I believe such a collection exists. We can call this the 'buy-one-get-one-free conjecture': think of T as the individual 'price' of the 'items' F, G, H; we want to arrange a special 'deal' where the second item is essentially free, but one has to pay full price for the third item.
Now if you think about it, what we're looking for is pretty strange. The function F should be efficiently computable in at least two 'essentially different' ways---one of which also gives us the value of G, and one of which gives H---yet there should be no efficient scheme to compute F that gives us G and H simultaneously. (This property seems easier to contrive when the inputs x, y are assumed to have a special, correlated form; I rule this out by insisting that F, G, H be total functions.)
The question makes equal sense when posed for other models of computation. In my paper, I proved the corresponding conjecture in the decision tree model of computation, as a special case of a more general result---see below. Communication complexity could be a reasonable model to attack next.
Please note: While this conjecture may require lower-bounds expertise to resolve, I believe that anyone with a creative spark could make an important contribution, by coming up with a good set of candidate functions F, G, H. Please feel encouraged to share any ideas you might have.
Background on the question
Let cc(F) denote the (deterministic) communication complexity of computing F(x, y). Next, let cc(F, G) denote the communication complexity of computing F(x, y) and G(x, y)---on the same input-pair (x, y). We define cc(F, H), cc(G, H), and cc(F, G, H) similarly.
Together, we think of these various quantities as summarizing the 'joint complexity' of the collection F, G, H. Of course, this notion can be extended to collections of k > 3 functions; the joint complexity is summarized by giving the communication complexity of all 2^k subsets of the collection. Let's let JC denote the function that takes as input a k-bit vector, and returns the complexity of computing the corresponding subcollection. So, in our 3-function example, we have
JC(1, 1, 0) = cc(F, G) and JC(0, 0, 1) = cc(H).
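To make this concrete, here is the full joint-complexity profile that the buy-one-get-one-free conjecture asks for, written as a small Python sketch. The numbers are just the conjectured targets from the question above (with T = 100 so that all entries are integers); nobody knows whether this profile is actually realizable.

```python
T = 100  # the individual 'price' of each item

# JC as a map from 3-bit vectors (indicating which of F, G, H must be
# computed) to communication costs.
JC = {
    (0, 0, 0): 0,
    (1, 0, 0): T, (0, 1, 0): T, (0, 0, 1): T,        # any single item: full price
    (1, 1, 0): 101, (1, 0, 1): 101, (0, 1, 1): 101,  # any pair: 2nd item ~free
    (1, 1, 1): 199,                                  # all three: full price again
}
```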
The question we want to ask is: what kinds of behavior are possible with the joint complexity, if we allow the functions F, G, H, etc. to be chosen arbitrarily? In other words, what different types of 'efficiencies' can arise in a collection of computational tasks (in the communication model)?
A little thought reveals some obvious constraints:
1. the joint complexity function JC must always be nonnegative and integer-valued, with JC(0, ..., 0) = 0 (computing the empty collection costs nothing).
2. monotonicity: Enlarging the subset of the functions to be computed cannot decrease the complexity. For example, we always have cc(F, G) >= cc(F), which translates to JC(1, 1, 0) >= JC(1, 0, 0).
3. subadditivity: Taking the union of two subsets of functions to be computed cannot increase the complexity beyond the sum of the individual complexities of the subsets. For example, cc(F, G, H) <= cc(F, G) + cc(H), since we can always compute (F, G) in an optimal fashion first, then compute H optimally afterwards.
(Technically, this assumes that in our model both players always know when a communication protocol halts, so that they can combine two protocols sequentially without any additional overhead. No big deal, though.)
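Since these constraints are finite and mechanical, any small candidate profile can be checked against them exhaustively. Here is a minimal Python sketch of such a check (the function name is my own), for a profile given as a dict like the one above:

```python
from itertools import product

def is_ecf(JC, k):
    """Check constraints 1-3 on a candidate joint-complexity profile JC,
    given as a dict mapping k-bit tuples to costs."""
    subsets = list(product((0, 1), repeat=k))
    # 1. nonnegative and integral, with the empty collection costing 0
    if JC[(0,) * k] != 0 or any(JC[s] < 0 or JC[s] != int(JC[s]) for s in subsets):
        return False
    for s, t in product(subsets, repeat=2):
        # 2. monotonicity: s contained in t implies JC[s] <= JC[t]
        if all(si <= ti for si, ti in zip(s, t)) and JC[s] > JC[t]:
            return False
        # 3. subadditivity: JC[s union t] <= JC[s] + JC[t]
        union = tuple(si | ti for si, ti in zip(s, t))
        if JC[union] > JC[s] + JC[t]:
            return False
    return True
```

On the buy-one-get-one-free profile above, is_ecf(JC, 3) returns True: that profile is a perfectly legitimate ECF, which is exactly what makes the question interesting.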
Now, a little further thought reveals that… well, there really aren't any other obvious, general constraints on the joint complexity! Let's call C an Economic Cost Function (ECF) if it obeys constraints 1-3. We are tempted to conjecture that perhaps every ECF is in fact equal to the joint complexity (in the communication model) of some particular collection of functions.
There are two things wrong with this conjecture. First, it's false, as can be seen by a simple counterexample: namely, the "buy-one-get-one-free" example, with T set to 1 (so that each single function and each pair costs 1 bit, while the triple costs 2). Roughly: in a 1-bit protocol, the transcript is a single bit sent by one player, and that bit determines every output; so if each pair among F, G, H is computable with 1 bit, all three functions must equal one such bit up to negation, and then that single bit computes all three at once. That's how I stumbled onto this example, and is one reason why I find it interesting.
However, if we relax the problem, and just ask to realize some scalar multiple of C as a joint complexity function, this counterexample loses its force.
The second thing wrong with the conjecture (in its relaxed form) is that, even if true, it'd likely be impossible to prove. This is because determining the exact computational cost of even modestly-complicated tasks is just way too hard. So I propose a doubly-relaxed form of the conjecture: I conjecture that if C is an ECF, then there is a joint complexity function that is a good pointwise approximation to some scalar multiple of C. (Here we allow a (1 ± eps) multiplicative error.)
In my paper, I managed to prove the corresponding conjecture for the model of decision trees (aka deterministic query algorithms). Several interesting ingredients were needed for the proof. Now, why do I believe the conjecture should also hold true for the communication model? In a nutshell, I think it should be possible to 'embed' tasks in the query model into the communication model, by a suitable distributed encoding of each bit, in such a way that the relative costs of all computational tasks are approximately preserved. If this could be shown, the result in the communication model would follow from my result for decision trees. (See the paper for more details.)
We may not be ready for an attack on the general conjecture, however. In particular, we seem to require a much better understanding of so-called 'Direct Sum problems' in communication complexity. Thus, I offer the 'buy-one-get-one-free conjecture' as a simpler, more concrete problem on which we can hope to make progress sooner.
In the decision tree model, my result allows us to realize an ECF of the 'buy-one-get-one-free' type as a joint complexity function; but I don't know of any method for this that's significantly simpler than my general construction. Even finding such a simpler method in the decision tree model would be a very nice contribution, and might lead to new ideas for the more general problem.
Labels: complexity
Comments:
Thanks for an interesting problem! Since you invite comments, I'm tempted to "offload" what popped into my head. I thought of sorting, where the total function maps an element to its position after the sort. I thought there had to be something in distributed sorting that corresponded, but that was as far as I got :-)
By Anonymous, at 3:49 PM
Is it obvious that such functions cannot exist if the constants 1.01 and 1.99 are replaced by 1 and 2 respectively?
By Anonymous, at 7:53 PM
Well, it's not obvious to me. Proving this would be a nice contribution. You could work in the query model for starters.
By Andy D, at 11:49 PM
Hey Andy!
I can't quite give you an example of three functions with the "buy one get one free" property, but I can do "buy one, get the second one half price". I think you might be interested in this as well. Actually, to be more accurate, I can give three functions F, G, H such that computing any one of them costs T, computing any two of them costs 4T/3, and computing all of them together costs 2T, all in the deterministic two-player communication complexity model, of course. The ideas seem flexible enough to allow a bunch of freedom (although I don't know if they'll go as far as proving your general conjecture). The ideas are not too inspirational, I'm afraid. Nothing deep.
Here is a warmup: let's design three functions F,G,H such that cc(F)=cc(G)=cc(H)=T, cc(F,G)=cc(F,H)=T, and cc(G,H)=cc(F,G,H)=2T. (In fact, some of these have an additive sqrt(T) term; I'll be hiding some lower-order terms.) The functions will not have boolean output; they'll have big outputs. But I'll say later how to correct this to get boolean outputs.
The structure of the input is as follows. There is some parameter k, and let r=k^2+1. The input to Alice is 2r vectors, denoted a_1,...,a_r,b_1,...,b_r. Each vector is of length n and its elements are from Z_k. The input to Bob has the same structure, with vectors c_1,...,c_r,d_1,...,d_r.
Now, the goal of the function F is to find two indices i,j such that [a_i,c_i]=[a_j,c_j] and [b_i,d_i]=[b_j,d_j]. Here, [.,.] denotes inner product modulo k. I'm using square brackets because HTML sucks. Note that such i,j always exist by the pigeonhole principle: the pair ([a_i,c_i],[b_i,d_i]) can take only k^2 distinct values, and there are r=k^2+1 indices.
The goal of the function G is to compute the inner products [a_i,c_i] for all i. And the goal of the function H is to compute [b_i,d_i] for all i.
Obviously, the cost of computing G is nr log k (by an easy direct sum argument and the well-known lower bound for inner product). The cost of computing H is also nr log k. The cost of computing both G and H is 2nr log k. But once you've computed either G or H, you get F for really cheap: you can pick k+1 "candidates" (indices sharing a common value of [a_i,c_i], which exist by pigeonhole) and just compute the remaining inner products on them. So once you've computed G, the additional cost of computing F is n(k+1) log k, which is something like sqrt(r)*n log k. In other words, cc(F,G)=cc(F,H)=cc(G)*(1+o(1)), but cc(G,H)=2*cc(G). Also, cc(F)=cc(G)*(1+o(1)). One disclaimer: I didn't think about all the details, so take the above with a grain of salt; I might have gotten something wrong. I hope it's correct.
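Here's a quick Python simulation of the pigeonhole counting above (just a local sanity check that the promised collisions exist, not a communication protocol; all names are made up):

```python
import random

k, n = 5, 8
r = k * k + 1  # r = k^2 + 1 index pairs
rand_vec = lambda: [random.randrange(k) for _ in range(n)]
a = [rand_vec() for _ in range(r)]; b = [rand_vec() for _ in range(r)]  # Alice
c = [rand_vec() for _ in range(r)]; d = [rand_vec() for _ in range(r)]  # Bob
ip = lambda u, v: sum(x * y for x, y in zip(u, v)) % k  # [u,v]: inner product mod k

g = [ip(a[i], c[i]) for i in range(r)]  # the answers to G
# Some value occurs at least k+1 times among the r = k^2+1 entries of g.
v = max(set(g), key=g.count)
candidates = [i for i in range(r) if g[i] == v][: k + 1]
# Among k+1 candidates, two must agree on [b_i,d_i] (only k possible values);
# in the protocol, these k+1 extra inner products are all F costs beyond G.
seen = {}
for i in candidates:
    h = ip(b[i], d[i])
    if h in seen:
        print("F's answer: indices", seen[h], "and", i)
        break
    seen[h] = i
```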
Now, what if we want boolean functions? Well, instead of a complicated function like F, G or H above, replace them by F', G' and H' respectively. F'(input) is equal to func(F(input)), where func is some complicated boolean function (think: a random function). It should be pretty obvious that computing func(x) requires computing x. Proving this kind of thing is non-trivial, but you should be able to choose a more concrete "func" and use some standard direct-sum arguments (I'm pretty sure I know how to do it; ask me if you want details).
Ok, finally: I promised at the beginning a collection A,B,C of functions such that computing any one of them costs T, any two cost 4T/3, and all three cost 2T. Here's how to do this. Choose three disjoint inputs for F,G,H as above. Call these inputs x,y,z. A will ask to compute F(x),G(y),H(z), B will ask to compute F(y),G(z),H(x), and C will ask to compute F(z),G(x),H(y).
Now, if we just want to compute A, this costs about three times as much as computing G. If we want to compute both A and B, then we get F(x) and F(y) essentially for free, from the fact that we're computing H(x) and G(y); so the total cost is about four times that of computing G. And finally, if we want to compute everything, it's about six times the cost of computing G.
Still need to double-check everything, but I think it all makes sense.
By Elad Verbin, at 9:06 PM
Following my previous comment, I think I see how to achieve "buy one get one free", i.e. exactly what you wanted. (Again, this needs re-checking.)
As before, all vectors are of length n over Z_k. As before, r=k^2+1. Alice now has three sets of vectors: a_1,...,a_r, b_1,...,b_r, c_1,...,c_r. Bob also has three sets of vectors, d_i's, e_i's and f_i's. Function A asks you to find i,j such that [a_i,d_i]=[a_j,d_j] and [b_i,e_i]=[b_j,e_j]. Function B asks you to find i,j such that [a_i,d_i]=[a_j,d_j] and [c_i,f_i]=[c_j,f_j]. And function C asks you to find i,j such that [b_i,e_i]=[b_j,e_j] and [c_i,f_i]=[c_j,f_j].
To compute both A and B cheaply, we compute all inner products [a_i,d_i], and find k+1 indices i whose values [a_i,d_i] are all the same. Then both A and B become very cheap. Similarly, you can compute both B and C cheaply by computing all [c_i,f_i]'s, and both A and C cheaply by computing all [b_i,e_i]'s. But computing all three together is more expensive, and will cost twice as much. Quantitatively: computing any two of them together costs nr log k * (1+o(1)), but computing all three costs at least 2nr log k. If I'm not wrong.
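Here's the same kind of Python sanity check for this construction (again a simulation with made-up names, not a protocol): once the single batch [a_i,d_i] is computed, it yields k+1 candidate indices on which collisions for both A and B are guaranteed, so the second relation really does come for free.

```python
import random

k, n = 5, 8
r = k * k + 1
vec = lambda: [random.randrange(k) for _ in range(n)]
a, b, c = [vec() for _ in range(r)], [vec() for _ in range(r)], [vec() for _ in range(r)]
d, e, f = [vec() for _ in range(r)], [vec() for _ in range(r)], [vec() for _ in range(r)]
ip = lambda u, v: sum(x * y for x, y in zip(u, v)) % k

ad = [ip(a[i], d[i]) for i in range(r)]      # the batch shared by A and B
v = max(set(ad), key=ad.count)               # occurs >= k+1 times by pigeonhole
cand = [i for i in range(r) if ad[i] == v][: k + 1]

def collision(val):  # val(i): the second inner product at index i; only the
    seen = {}        # k+1 candidates get evaluated, which is why this is cheap
    for i in cand:
        h = val(i)
        if h in seen:
            return seen[h], i
        seen[h] = i

print("pair for A:", collision(lambda i: ip(b[i], e[i])))
print("pair for B:", collision(lambda i: ip(c[i], f[i])))
```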
By Elad Verbin, at 7:11 AM
One more comment: actually, the functions I show in both my comments above are not total functions at all; rather, they are relations. So I really haven't answered your question at all; I've just answered it for relations. That said, they look like "almost functions", if you see what I mean. So I suspect it wouldn't be too hard to use the same ideas to get a version where everything is indeed a total function, but I could be wrong.
By Elad Verbin, at 7:19 AM