Right-Most Position of a Last Progeny Modified Branching Random Walk
If you have a question about this talk, please contact Sourav Sarkar.

In this talk, we will consider a modification of the usual Branching Random Walk (BRW), in which independent and identically distributed (i.i.d.) displacements/perturbations are given to all the particles at generation $n$. We call this process the last progeny modified branching random walk (LPM-BRW). Depending on the value of a parameter $\theta > 0$, which acts as a "scale parameter" for the perturbations, we classify the model into three distinct cases, namely, the boundary case, the below-the-boundary case, and the above-the-boundary case. Under very minimal assumptions on the underlying point process of the increments, we will show that in the boundary case the maximum displacement converges to a limit after only an appropriate centering, which is of the form $c_1 n - c_2 \log n$. We will give explicit formulas for the constants $c_1$ and $c_2$ and will show that $c_1$ is exactly the same as, while $c_2$ is $1/3$ of, the corresponding constants of the classical BRW [Aïdékon 2013]. We will also characterize the limiting distribution as a randomly shifted Gumbel distribution. We will further show that below the boundary the logarithmic correction term is absent, while above the boundary the logarithmic correction term is exactly the same as that of the classical BRW. If time permits, we will also show that Brunet-Derrida-type results on point process convergence of our LPM-BRW to a Poisson point process hold. Our proofs are based on a novel method of coupling the maximum displacement with a linear statistic associated with a well-studied process in statistics known as the smoothing transformation.

[This is joint work with Partha Pratim Ghosh]

This talk is part of the Probability series.
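As a concrete illustration of the kind of model described in the abstract, here is a minimal Monte Carlo sketch (not taken from the talk or the underlying paper). It assumes a binary tree with i.i.d. standard Gaussian displacements and adds an i.i.d. standard Gumbel perturbation scaled by $1/\theta$ only to the generation-$n$ particles; the branching number, increment law, perturbation law, and all names in the code are illustrative assumptions, not the exact setup used in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def lpm_brw_max(n, theta, branching=2, n_runs=200):
    """Monte Carlo estimate of the mean right-most position of an
    illustrative last progeny modified BRW after n generations.

    Assumptions (for illustration only): binary branching, i.i.d.
    standard Gaussian displacements along edges, and an extra i.i.d.
    standard Gumbel perturbation scaled by 1/theta added to every
    generation-n particle.
    """
    maxima = np.empty(n_runs)
    for r in range(n_runs):
        positions = np.zeros(1)
        for _ in range(n):
            # each particle branches; children start at the parent position...
            parents = np.repeat(positions, branching)
            # ...and receive an i.i.d. Gaussian displacement
            positions = parents + rng.standard_normal(parents.size)
        # last progeny modification: perturb only the final generation
        positions = positions + rng.gumbel(size=positions.size) / theta
        maxima[r] = positions.max()
    return maxima.mean()

if __name__ == "__main__":
    # crude look at how the estimated maximum grows with n
    for n in (8, 10, 12):
        print(n, lpm_brw_max(n, theta=1.0))
```

Comparing such estimates across several values of $n$ and $\theta$ gives a rough numerical feel for the $c_1 n - c_2 \log n$ centering discussed in the abstract, though extracting the logarithmic correction reliably requires far larger simulations.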