Andy's Math/CS page

Saturday, September 08, 2007

On the Move

Time for a personal update: this week I began a stay as a visiting student at MIT's CS and AI Lab (CSAIL)--a very exciting place to be for a young would-be theorist. For this privilege I have friend, fellow blogger, and new MIT prof Scott Aaronson to thank--thanks, Scott!

While here, I'd absolutely love to meet, get to know, trade ideas or collaborate with any Boston/Cambridge readers. Drop a line or come by my office, 32-G630. I'd also welcome advice on where to eat, how to furnish an apartment on the cheap, how to stop getting lost constantly around town and campus, and all that good stuff.

Speaking of getting lost, a small PSA: did you know that in a long random walk on a finite connected undirected graph, you can expect to appear at any given node with limiting frequency proportional to its degree? The node's location and centrality in the graph are ultimately irrelevant. Equivalently, every edge gets traversed with the same limiting frequency, and equally often in either direction.
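
If you'd like to see this numerically, here is a minimal simulation sketch (in Python; the small example graph and the walk length are arbitrary illustration choices, nothing canonical):

    import random
    from collections import defaultdict

    # A small connected undirected graph, given as an adjacency list.
    graph = {
        0: [1, 2, 3],
        1: [0, 2],
        2: [0, 1, 3],
        3: [0, 2, 4],
        4: [3],
    }

    steps = 1_000_000
    node = 0
    visits = defaultdict(int)

    for _ in range(steps):
        node = random.choice(graph[node])  # step to a uniformly random neighbor
        visits[node] += 1

    total_degree = sum(len(nbrs) for nbrs in graph.values())  # equals 2 * (number of edges)

    for v in sorted(graph):
        empirical = visits[v] / steps
        predicted = len(graph[v]) / total_degree  # degree(v) / 2|E|
        print(f"node {v}: empirical {empirical:.4f}, predicted {predicted:.4f}")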

Despite its simplicity, this was an eye-popper for me as an undergrad, and my favorite result in our probability seminar, along with the theorem that the harmonic series 1 + 1/2 + 1/3 + ... converges with probability 1 when each term is given an independent random sign.
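
One fun way to get a feel for the random-sign result is to watch the partial sums of a single sampled sign sequence settle down; here is a small sketch (in Python; the series length and checkpoints are arbitrary choices):

    import random

    # One sampled sequence of independent random signs; print the partial sums
    # of sum_k s_k / k at a few checkpoints to watch them stabilize.
    random.seed(0)
    n_terms = 10**6
    checkpoints = {10**j for j in range(1, 7)}

    partial = 0.0
    for k in range(1, n_terms + 1):
        partial += random.choice((-1.0, 1.0)) / k
        if k in checkpoints:
            print(f"after {k:>8} terms: partial sum = {partial:+.6f}")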

11 Comments:

  • Both of those results seemed intuitive to me in that seminar. I always think of the harmonic series as the cutoff between diverging and converging, so as soon as you make the signs random it has to go the other way. What I found surprising from the probability seminar were some of the results about "simple" one-dimensional symmetric random walks from Feller, Chapter III.


    I have a relatively vague probability question for you. You can interpret the vague parts as you wish; in fact, finding a meaning for these things is part of the question.

    Imagine walking on the integers, starting at zero. The walker moves +2 or -1 equiprobably. What "fraction" of the integers are visited at least once? First figure out what that question should mean; then answer it.

    By Blogger G, at 11:28 AM  

  • Hi George. (Is this George Dahl from Swat? I can't view your profile.)

    I agree that other seminar results were more surprising; these were just the ones that charmed me the most and made me want to learn more probability.

    Also, in the random-sign setting, the harmonic series ceases to be the cutoff point for convergence--I believe the cutoff becomes p = 1/2; see this JSTOR note:
    http://www.jstor.org/view/00034851/di983720/98p0072n/0

    As for your question, I like it. I'll see what I can do, and put a solution in a new post if I solve it.

    By Blogger Andy D, at 1:37 PM  

  • Hi George and Andy,

    First, for George's question: all integers are visited at least once. More precisely, let p(n) be the probability that you'll visit every integer in -n^{1/3}, ..., +n^{1/3} within n steps. Then the sequence p(n) tends to 1 as n tends to infinity. (I think this works with 1/2 instead of 1/3, but I don't want to commit to it right now; maybe then p(n) is only bounded from below by a constant.) So this means that you'll visit all of the line.

    I won't prove the whole thing, but let me try to convince you: say that you accept gambler's ruin as a given (i.e., if you start from 0, you'll hit every integer eventually). Now, let's say you start from 0. Eventually, you'll hit +100. Then, eventually, you'll hit -100, because hitting -100 from +100 takes exactly the same time as hitting -200 from 0. Now, after you hit -100, eventually you'll hit +200, and so on.

    Now, about something Andy said: I didn't know the nice result about summing the harmonic series with random signs. But now that Andy told us about it, I see a simple back-of-the-envelope explanation that also gives the p = 1/2 cutoff (more or less). It goes like this: let's partition the harmonic series into chunks: first {1}, then {1/2, 1/3}, then {1/4, 1/5, 1/6, 1/7}, and so on. The i-th chunk has size 2^i and consists of the numbers {1/2^i, ..., 1/(2^{i+1}-1)}. Now, instead of this, I'm going to make believe that the i-th chunk consists of 2^i numbers all equal to 1/2^i. This shouldn't matter for the computation, because each term changes by at most a factor of 2. Now, what do you expect to get when you sum up 2^i copies of 1/2^i with random signs? That's right, something between -1/2^{i/2} and +1/2^{i/2}. This is the square-root scaling of a simple random walk (the same walk behind gambler's ruin): after n random +/-1 steps, you're likely to be at a location between -sqrt(n) and +sqrt(n), and here each step has size 1/2^i.

    Now, it's easy to see that the partial sums are in no danger of oscillating like -1, +1, -1, +1, so we just need to know whether the sum tends to infinity or to a finite number. (There's a standard formal argument to that effect, taught in Calculus 1.) So the worst case is when the whole sum is 1 + 1/2^{1/2} + 1/2 + 1/2^{3/2} + ..., which is a geometric series adding up to a finite constant. You can check that this argument also works for the sum 1 + 1/2^p + 1/3^p + ... for p > 1/2, and that for p = 1/2 it stops working. I don't have a back-of-the-envelope proof that for p < 1/2 the sum ceases to converge.
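
    If you want to sanity-check the chunk estimate numerically, here's a small sketch (in Python; the chunk range and trial count are arbitrary choices) comparing the root-mean-square of a random-sign chunk sum against 2^{-i/2} -- they agree up to a constant factor:

        import random
        import statistics

        def chunk_sum(i):
            """Random-sign sum of the i-th chunk {1/2^i, ..., 1/(2^(i+1)-1)}."""
            return sum(random.choice((-1, 1)) / k for k in range(2**i, 2**(i + 1)))

        trials = 2000
        for i in range(2, 11):
            rms = statistics.mean(chunk_sum(i) ** 2 for _ in range(trials)) ** 0.5
            print(f"chunk {i:2d}: RMS {rms:.4f}   vs   2^(-i/2) = {2 ** (-i / 2):.4f}")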

    By Anonymous Anonymous, at 3:43 AM  

  • Elad,
    Thanks! I believe your reasoning on the random series is 'essentially correct', i.e. will lead to a valid proof. It's also the kind of resolutely finitistic proof I tend to think of, as opposed to the more analytic proofs that predominate in probability theory.

    On the other hand, I disagree with you on the random walk question--the walk will diverge properly to +infty with certainty, visiting only finitely many of the negative integers.

    To see this, consider the random walk's associated sequence of pluses and minuses. Let b_t = fraction of pluses in the first t steps. If x_t denotes position at time t, for x_t to be negative we must have b_t less than 1/3.

    But by the law of large numbers, b_t, being the average of t independent Bernoulli variables, converges with certainty to the distributional average 1/2 (since plus and minus are equiprobable). So beyond some point, there are no more negative x_t.

    There's a generalization of this result that I recall from the same seminar: any space/time-homogeneous random walk with bounded step length and positive expected displacement per step diverges properly with certainty. You can probably relax the boundedness assumption somewhat.

    Here's what I believe the asymptotic density of hits, taken over the nonnegative integers, will be: 1/2 + p, where p denotes the probability that the original walk ever hits -1. If someone can compute p, I think I can supply the rest of the argument.
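
    In the meantime, here's a rough Monte Carlo sketch (in Python; the trial counts and cutoffs are arbitrary stand-ins for "eventually") that estimates both the probability of ever hitting -1 and the empirical density of visited nonnegative integers:

        import random

        random.seed(1)

        def one_walk(n_steps):
            """Run one +2/-1 walk from 0; return (hit -1?, set of visited sites)."""
            pos = 0
            visited = {0}
            hit_minus_one = False
            for _ in range(n_steps):
                pos += random.choice((2, -1))
                visited.add(pos)
                if pos == -1:
                    hit_minus_one = True
            return hit_minus_one, visited

        trials, n_steps = 200, 20_000
        hits = 0
        density_samples = []
        for _ in range(trials):
            hit, visited = one_walk(n_steps)
            hits += 1 if hit else 0
            # Count visited sites in a window the walk has long since left behind
            # (drift is +1/2 per step, so the walk ends far beyond n_steps // 4).
            window = range(n_steps // 4)
            density_samples.append(sum(1 for x in window if x in visited) / len(window))

        print(f"estimated probability of ever hitting -1: {hits / trials:.3f}")
        print(f"estimated density of visited nonnegative integers: {sum(density_samples) / trials:.3f}")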

    By Anonymous Anonymous, at 10:16 AM  

  • Strangely enough, just now I peeked at IBM's 'Ponder This' puzzle page, after a year's inattention, and I found the very same problem:

    http://domino.research.ibm.com/Comm/wwwr_ponder.nsf/challenges/April2007.html

    Haven't looked at the solution, though--I hate spoiling puzzles...

    By Anonymous Anonymous, at 12:30 PM  

  • http://domino.research.ibm.com/Comm/wwwr_ponder.nsf
    /challenges/April2007.html

    By Anonymous Anonymous, at 12:31 PM  

  • http://domino.research.ibm.com/Comm/
    wwwr_ponder.nsf/challenges/April2007.html

    By Anonymous Anonymous, at 12:31 PM  

  • check that you don't insert a space into the url by the line break...

    By Anonymous Anonymous, at 12:33 PM  

  • Andy, maybe we're having some mutual misunderstanding.

    You said: If x_t denotes position at time t, for x_t to be negative we must have b_t less than 1/3.

    I don't agree. For x_t to be negative, I think you must have b_t less than 1/2.

    By Anonymous Anonymous, at 6:43 AM  

  • OK, maybe we are talking about different problems.

    My understanding is that plus steps take the walker +2, while minus steps take the walker -1. So if P, M are the numbers of plus steps and minus steps, respectively, after time t, we have x_t = 2P - M. For that to be negative we need M > 2P, i.e. P < M/2, which means that the *fraction* of steps which are plus is less than 1/3 (since it is exactly at that threshold that plus steps are outnumbered 2 to 1).

    By Anonymous Anonymous, at 9:53 AM  

  • Yes Andy, I am the George Dahl from Swat. I enjoy your blog.

    By Blogger G, at 1:21 AM  
