2010
DOI: 10.1007/s00026-010-0062-5
Separation Cutoffs for Random Walk on Irreducible Representations

Abstract: Random walk on the irreducible representations of the symmetric and general linear groups is studied. A separation distance cutoff is proved and the exact separation distance asymptotics are determined. A key tool is a method for writing the multiplicities in the Kronecker tensor powers of a fixed representation as a sum of non-negative terms. Connections are made with the Lagrange-Sylvester interpolation approach to Markov chains.
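The random walk described in the abstract can be illustrated concretely on a small group. The sketch below is not taken from the paper; it implements the standard tensor product chain construction under the usual conventions: states are the irreducible representations of a finite group, and from state λ the chain tensors with a fixed representation ρ and moves to an irreducible constituent μ with probability proportional to dim(μ) times the multiplicity of μ in ρ⊗λ. The group S₃ and the choice of ρ as its 2-dimensional standard representation are illustrative assumptions.

```python
from fractions import Fraction as F

# Character data for S_3: conjugacy class sizes and irreducible characters.
# Classes: identity, transpositions, 3-cycles. (Illustrative example only.)
G_ORDER = 6
CLASS_SIZES = [1, 3, 2]
CHARS = {
    "trivial":  [F(1), F(1),  F(1)],
    "sign":     [F(1), F(-1), F(1)],
    "standard": [F(2), F(0),  F(-1)],
}
DIM = {name: chi[0] for name, chi in CHARS.items()}
IRREPS = list(CHARS)

def mult(rho, lam, mu):
    """Multiplicity of irrep mu in rho tensor lam, via the character inner product."""
    return sum(F(size) * CHARS[rho][i] * CHARS[lam][i] * CHARS[mu][i]
               for i, size in enumerate(CLASS_SIZES)) / G_ORDER

RHO = "standard"  # the fixed representation driving the walk

# Transition probabilities:
#   K(lam, mu) = dim(mu) * mult(rho tensor lam, mu) / (dim(rho) * dim(lam))
K = {lam: {mu: DIM[mu] * mult(RHO, lam, mu) / (DIM[RHO] * DIM[lam])
           for mu in IRREPS} for lam in IRREPS}

# Stationary distribution is the Plancherel measure pi(lam) = dim(lam)^2 / |G|.
pi = {lam: DIM[lam] ** 2 / F(G_ORDER) for lam in IRREPS}

def step(dist):
    """One step of the chain applied to a distribution on irreps."""
    return {mu: sum(dist[lam] * K[lam][mu] for lam in IRREPS) for mu in IRREPS}

def separation(dist):
    """Separation distance from stationarity: max over states of 1 - dist(mu)/pi(mu)."""
    return max(1 - dist[mu] / pi[mu] for mu in IRREPS)

# Start at the trivial representation and track separation distance.
dist = {lam: F(1) if lam == "trivial" else F(0) for lam in IRREPS}
for n in range(1, 6):
    dist = step(dist)
    print(n, float(separation(dist)))
```

Using exact rational arithmetic makes it easy to verify the defining properties by hand: each row of K sums to 1, and the Plancherel measure is fixed by `step`. The paper's results concern the asymptotics of this separation distance for the symmetric and general linear groups as the group grows, which this toy example does not capture.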

Cited by 4 publications (3 citation statements)
References 31 publications
“…Note that it takes n − 2 iterations of the Markov chain K to move from t to t ′ . By Proposition 4.1, K has n − 1 distinct eigenvalues (one more than the Markov chain distance between t and t ′ ), so it follows from Proposition 5.1 of [16] that…”
Section: Proof of Main Results
confidence: 99%
“…In Section 4 we specialize our results to the tensor product Markov chains studied in [Ful08,Ful04,Ful10,BDLT19]. The result is that the tensor quasi-randomness of a group G characterizes whether certain tensor product chains mix in constant time.…”
Section: G Has Neither an O(1)-Size Nor an Abelian Non-trivial Quot...
confidence: 99%
“…Corollary 2.2 is very similar to Proposition 2 of [8], but our result is cleaner, as it does not involve associated Stirling numbers of the second kind. For another approach to the decomposition of tensor powers of ̺, see [6].…”
Section: Decomposition Formula for Tensor Powers of ̺
confidence: 99%