Rainbow powers of a Hamilton cycle in Gn,p

We show that the threshold for having a rainbow copy of a power of a Hamilton cycle in a randomly edge-colored copy of G_{n,p} is within a constant factor of the uncolored threshold. Our proof requires (1 + ε) times the minimum number of colors.


Introduction
There has recently been great progress in our understanding of thresholds for monotone properties in the random graph G_{n,p}. Inspired by the work of Alweiss, Lovett, Wu and Zhang [1] on the Sunflower Conjecture, Frankston, Kahn, Narayanan and Park [4] showed that, under fairly general conditions, the threshold for the existence of combinatorial objects is within a factor O(log n) of the point where the expected number of such objects begins to take off. Great though these results are, this is not the end of the story. In a paper remarkable for the strength of its result and for the simplicity of its proof, Park and Pham [10] proved the so-called Kahn-Kalai conjecture [8], which implies the result of [4].
Kahn, Narayanan and Park [7] tightened their analysis for the case of the square of a Hamilton cycle, removing the O(log n) factor and solving the existence problem up to a constant factor; a remarkable achievement, given the complexity of the proofs of earlier weaker results. Their result was generalized by Espuny Díaz and Person [3] and Spiro [12], both of whom gave more general conditions under which the O(log n) factor can be removed. Espuny Díaz and Person asked whether a rainbow generalization of their result could be proven [3]. Our main theorem proves a rainbow version in a setting that is more general than the Kahn-Narayanan-Park result but less general than the Espuny Díaz-Person and Spiro results. It is likely that our result could be extended to the full generality of those results with some additional effort.

Some notation. Given a set X and 0 ≤ p ≤ 1, we let X_p denote a random subset of X in which each x ∈ X is placed independently with probability p. Similarly, X_m denotes a uniformly random m-subset of X for 1 ≤ m ≤ |X|.
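The two sampling models just introduced are easy to simulate; the following minimal Python sketch (the function names are ours, not from the text) makes the distinction concrete: X_m always has exactly m elements, while |X_p| is Binomial(|X|, p).

```python
import random

def binomial_subset(X, p, rng=random):
    """X_p: each x in X is included independently with probability p."""
    return {x for x in X if rng.random() < p}

def uniform_subset(X, m, rng=random):
    """X_m: a uniformly random m-subset of X, for 1 <= m <= |X|."""
    return set(rng.sample(sorted(X), m))

rng = random.Random(0)
X = set(range(100))
W = uniform_subset(X, 10, rng)    # |W| is always exactly 10
Y = binomial_subset(X, 0.1, rng)  # |Y| has the Binomial(100, 0.1) law
```
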
Let H = {A_1, A_2, ..., A_M} be a hypergraph on vertex set X. A key notion in this analysis is that of spread. For a set S ⊆ X we let ⟨S⟩ = {T : S ⊆ T ⊆ X} denote the collection of subsets of X that contain S. We say that H is κ-spread if

|H ∩ ⟨S⟩| ≤ κ^{-|S|} |H| for all S ⊆ X.   (1)

H is called r-bounded if |A| ≤ r for all A ∈ H and r-uniform if |A| = r for all A ∈ H. The following theorem was proved in [4]:

Theorem 1. Let H be an r-bounded, κ-spread hypergraph and let X = V(H), N = |X|. There is an absolute constant C_1 > 0 such that if

p ≥ (C_1 log r)/κ or m ≥ C_1 (N/κ) log r,   (2)

then w.h.p. X_p or X_m, respectively, contains an edge of H. More precisely, P(X_p contains an edge of H) ≥ 1 − ε_r, where ε_r → 0 as r → ∞.
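For intuition, the spread condition (1) can be checked by brute force on tiny hypergraphs. The checker below is our own illustration (it enumerates subsets of edges, so it is feasible only for very small H):

```python
from itertools import combinations

def is_kappa_spread(H, kappa):
    """Brute-force check of |H ∩ <S>| <= kappa^(-|S|) |H| for every
    nonempty S contained in some edge (other S give |H ∩ <S>| = 0)."""
    M = len(H)
    checked = set()
    for A in H:
        for s in range(1, len(A) + 1):
            for S in combinations(sorted(A), s):
                if S in checked:
                    continue
                checked.add(S)
                count = sum(1 for B in H if set(S) <= B)
                if count * kappa ** s > M:
                    return False
    return True

# Toy example: all 2-subsets of a 10-point set form M = 45 edges; a single
# vertex lies in 9 of them, so the condition holds at kappa = 5 (9*5 <= 45)
# but fails at kappa = 6.
H = [frozenset(e) for e in combinations(range(10), 2)]
```
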
To apply this theorem to, say, Hamilton cycles, we let X = E(K_n), so that N = \binom{n}{2}, and we let A_i, i = 1, 2, ..., (n − 1)!/2, be the edge sets of the Hamilton cycles of K_n.
In the special case of H corresponding in this way to the squares of Hamilton cycles, [7] removed the log r factor from the bounds in (2).
We now turn to the main topic of this note. We suppose that each x ∈ X is uniformly and independently given a random color from a set Q. Given a set A ⊆ X we write A* for A after its elements have been colored. We say that A* is rainbow colored if each a ∈ A has a different color. Bell, Frieze and Marbach [2] attempted to extend the results of [4] to rainbow colorings. They proved:

Theorem 2. Let H be an r-bounded, κ-spread hypergraph and let X = V(H) be randomly colored from Q = [q], where q ≥ r. Suppose also that κ = Ω(r), that is, there exists a constant C_0 > 0 such that κ ≥ C_0 r for all valid r. Then given ε > 0 there is a constant C_ε such that if r is sufficiently large and

m ≥ C_ε (N/κ) log r,   (3)

then X_m contains a rainbow colored edge of H with probability at least 1 − ε.
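The coloring model of Theorem 2 is equally simple to simulate; here is a minimal sketch (names ours) of the random coloring and the rainbow test:

```python
import random

def random_coloring(X, q, rng=random):
    """Assign each element of X an independent uniform color from {0,...,q-1}."""
    return {x: rng.randrange(q) for x in X}

def is_rainbow(A, color):
    """A is rainbow iff its elements receive pairwise distinct colors."""
    return len({color[a] for a in A}) == len(A)
```

With q ≥ |A| colors a fixed A may or may not be rainbow; quantifying how often it is drives the first-moment calculation in Section 2.
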
The constraint κ = Ω(r) rules out the square of a Hamilton cycle, as there we have r = 2n and κ = O(n^{1/2}). The aim of this note is to tackle this case while also removing the extra log r factor. Unfortunately, we have to increase the number of colors slightly, by a factor (1 + ε_1) for arbitrary positive ε_1.

Chapter 15 of [6] extracts a property used in [7] to make the following extra assumption about the hypergraph H. For A ∈ H we let f_{t,A} = |{B ∈ H : |A ∩ B| = t}| for t ≥ 0. The assumption now is that there exist constants 0 < α < 1 and K_0, independent of r, such that

f_{t,A} ≤ (K_0/κ)^t |H| for all A ∈ H and 1 ≤ t ≤ αr.   (4)

As H remains κ-spread, it follows from (1) that

f_{t,A} ≤ \binom{r}{t} κ^{-t} |H| ≤ (e/(ακ))^t |H| for t > αr.   (5)

Call a hypergraph H edge transitive if for every pair of edges A_i, A_j there exists a permutation π : X → X such that π(A_i) = A_j and such that π(A) ∈ H for all A ∈ H. The induced map π : H → H is then a bijection.
(When H is defined by the edges of K_n, all we usually require is a permutation of the vertices.) We will prove the following:

Theorem 3. Let ε, ε_1 > 0 be arbitrary positive constants. Suppose that H is a κ-spread, r-uniform and edge transitive hypergraph for which (4) holds. Let X = V(H) be randomly colored from Q = [q], where q ≥ (1 + ε_1)r. Then there exists C = C(ε, ε_1) such that for sufficiently large r and κ, if m ≥ CN/κ then

P(X_m contains a rainbow colored edge of H) ≥ 1 − ε.   (6)

We will show in Section 3 that the hypergraphs corresponding to powers of Hamilton cycles satisfy the premise of Theorem 3. ([7] verified (4) for squares of Hamilton cycles; for completeness, we verify (4) for all powers.) We prove Theorem 3 in the next section. We note that our proof is in part inspired by a proof by Huy Pham [11] of the main result of [7].
Proof of Theorem 3

The proof proceeds in three stages. First, we color all elements of X independently and uniformly at random from [q] and remove all sets in H that are not rainbow. We show that the number of remaining sets is, with high probability, close to its expectation.
Then, let N = |X| and m = CN/κ for sufficiently large C = C(ε, ε_1). Let W_0 be a uniformly random m-subset of X and let W_1 be obtained from X \ W_0 by including each element independently with probability p_1 = m/N. Proving Theorem 3 for W_0 ∪ W_1 suffices to prove it for X_{O(m)}, by standard concentration bounds. The second stage (succeeding with high probability) will deal with W_0, while the third stage (succeeding with probability 1 − ε) will deal with W_1.
We will use the notation A ≲ B to indicate that A ≤ (1 + o(1))B as r → ∞. We will also assume that q = (1 + ε_1)r. This assumption comes without loss of generality: C(ε, ε_1) will be strictly decreasing in ε_1, so if q > (1 + ε_1)r, we could choose ε_2 such that q = (1 + ε_2)r and use ε_2 in the proof instead.

The size of H*
Let H = {A_1, A_2, ..., A_M} and let H* denote the set of rainbow edges of H after a uniform and independent random coloring. Similarly, let X* denote X after it has been randomly colored. Let (a)_b = a(a − 1)⋯(a − b + 1) denote the falling factorial. We use the Chebyshev inequality to prove concentration of Z = |H*| around its mean. We have

E(Z) = (q)_r q^{-r} |H| → ∞

as r → ∞, because spread (with S ∈ H in (1)) implies that |H| ≥ κ^r, and we have assumed that κ is sufficiently large.
Using the edge transitivity of H to obtain (7), we have

E(Z²) = Σ_{i=1}^{M} Σ_{j=1}^{M} P(A_i* and A_j* are both rainbow) = M Σ_{j=1}^{M} P(A_1* and A_j* are both rainbow),   (7)

and, splitting the inner sum according to t = |A_1 ∩ A_j|,

E(Z²) ≲ E(Z)².   (8)

Explanation for (8): for the terms with t ≤ αr we use (4) on A_1, and for the terms with t > αr we use spread, summing over all T ⊆ A_1 with |T| = t. So Var(Z) = o(E(Z)²) as long as κ, r → ∞. It follows that w.h.p. Z ≳ E(Z).
Thus, for the rest of the proof we will assume that

|H*| ≥ (1 − o(1)) (q)_r q^{-r} |H|.   (9)
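The first-moment calculation rests on the fact that a fixed r-set is rainbow under a uniform q-coloring with probability (q)_r q^{-r}. The following sketch (our code, not from the paper) compares the exact formula against simulation:

```python
import random
from math import prod

def rainbow_prob(q, r):
    """P(a fixed r-set is rainbow) = (q)_r / q^r = q(q-1)...(q-r+1) / q^r."""
    return prod(q - i for i in range(r)) / q ** r

def rainbow_prob_mc(q, r, trials, rng):
    """Monte Carlo estimate of the same probability."""
    hits = sum(
        len({rng.randrange(q) for _ in range(r)}) == r for _ in range(trials)
    )
    return hits / trials

# Exact value for q = 4, r = 2: (4)_2 / 4^2 = 12/16 = 0.75;
# the simulation should land close to it.
```
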

Random sample from X
Given a set A* ∈ H*, we define T*(A*, W_0*) to be a minimum-size subset T* of A* \ W_0* such that W_0* ∪ T* contains an edge of H*. For B* ∈ H* we let f*_{t,B*} denote the number of A* ∈ H* with |A* ∩ B*| = t. If B* is rainbow and |A ∩ B| = t, then A* is rainbow with probability (q − t)_{r−t} q^{-(r−t)} = (q^t/(q)_t)(q)_r q^{-r} ≤ ((1 + ε_1)/ε_1)^t (q)_r q^{-r}. Hence, by (4) and (9), in expectation over the coloring,

f*_{t,B*} ≲ ((1 + ε_1)K_0/(ε_1 κ))^t |H*| for 1 ≤ t ≤ αr,   (11)

and for t > αr, we have by (5) the same bound with K_0 replaced by e/α.   (10)

Let ω → ∞ slowly as r → ∞. Call a pair (A*, W_0*) bad if |T*(A*, W_0*)| ≥ ω and good otherwise. We say that W_0* is a success if at most half of the sets A* ∈ H* form a bad pair with W_0*, that is, if the majority of sets in H* have a relatively small T*.

Lemma 4. P(success) ≥ 1 − c_0^ω for some constant 0 < c_0 < 1.
Proof. Let ν_bad denote the number of bad pairs (A*, W_0*). Fix a function φ : 2^{X*} → H*, where φ(S*) ⊆ S* whenever S* contains a set in H*. We claim that, for each t ≥ ω, the number of pairs (A*, W_0*) with |T*(A*, W_0*)| = t is at most

\binom{N}{m+t} max_{|Z*| = m+t} Σ_{t' ≥ t} 2^{t'} f*_{t',φ(Z*)}.   (12)

Explanation for (12): this follows from the key observation of recent threshold papers [7, 10]. We count the pairs (A*, W_0*) with |T*(A*, W_0*)| = t for a given t ≥ ω. We first fix Z* = T* ∪ W_0*, which, as these sets are disjoint, has size m + t. Then we let B* = φ(Z*). Since Z* contains an edge of H* and φ(Z*) is a valid choice of B*, the minimality of T* gives T* ⊆ φ(Z*) ∩ A*, and so t' = |φ(Z*) ∩ A*| ≥ t. Given t', we can specify one of the at most f*_{t',φ(Z*)} possibilities for A* as a superset of φ(Z*) ∩ A*. We then specify T* ⊆ φ(Z*) ∩ A* in at most 2^{t'} ways, which uniquely determines W_0* = Z* \ T*.

By linearity of expectation and Equations (10), (11), and (12), we get

E(ν_bad) ≲ Σ_{t ≥ ω} (\binom{N}{m+t}/\binom{N}{m}) Σ_{t' ≥ t} 2^{t'} ((1 + ε_1)K_0/(ε_1 κ))^{t'} |H*|.   (13)

Continuing, and using

\binom{N}{m+t}/\binom{N}{m} ≤ (N/m)^t = (κ/C)^t,   (14)

the above gives E(ν_bad) ≤ c_0^ω |H*| for some constant 0 < c_0 < 1, provided C is sufficiently large, and thus, by the Markov inequality, P(ν_bad ≥ |H*|/2) ≤ 2c_0^ω. By taking ω → ∞ as r → ∞, success happens with high probability, and after adjusting c_0 the lemma follows.
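The minimal fragment T*(A*, W_0*) at the heart of Lemma 4 can be computed by brute force on toy examples. The sketch below (our code, exponential time, tiny inputs only) also illustrates the containment of T* in an edge of H that drives the counting:

```python
from itertools import combinations

def min_fragment(A, W, H):
    """A smallest T ⊆ A \\ W such that W ∪ T contains some edge of H,
    or None if even W ∪ A contains no edge (brute force)."""
    rest = sorted(set(A) - set(W))
    for t in range(len(rest) + 1):
        for T in combinations(rest, t):
            U = set(W) | set(T)
            if any(set(B) <= U for B in H):
                return set(T)
    return None

H = [{1, 2, 3}, {3, 4, 5}]
# With A = {1,2,3} and W = {2,4,5}, the single element 3 completes the
# edge {3,4,5}, so T* = {3} — and indeed T* ⊆ {3,4,5} ∈ H.
```
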

Finishing the proof
Suppose now that W_0* is a success, and let R* denote the multi-hypergraph in which each good pair (A*, W_0*) contributes the element A* \ W_0*. Let R*_0 = {R* ∈ R* : R* = ∅}. We can assume that R*_0 = ∅, as otherwise W_0* contains an edge of H* and we have already succeeded. Now, generate W_1*; if R* ⊆ W_1* for some R* ∈ R*, then W_0* ∪ W_1* contains a rainbow edge of H and Theorem 3 follows. Thus, we just need to show that with probability at least 1 − ε there exists such an R* ⊆ W_1*.
To aid in the calculations below, for each R* ∈ R* we say that R* is accepted if R* ⊆ W_1*. Let ν_R denote the number of accepted sets. It suffices to show that P(ν_R = 0) ≤ ε, which we will do by Chebyshev's inequality. Then

E(ν_R) = Σ_{R* ∈ R*} p_1^{|R*|} ≥ (|H*|/2) p_1^ω → ∞.   (15)

The claims in (15) follow from the fact that

ω = o(r^{1/2}) and κ p_1 = κ m/N = C,   (16)

together with the fact that, W_0* being a success, at least half of the sets A* ∈ H* contribute a good set R* = A* \ W_0* with |R*| ≤ ω. Now

Var(ν_R) ≤ E(ν_R) + Σ p_1^{|R*| + |S*| − |R* ∩ S*|},   (17)

where the sum is over ordered pairs R* ≠ S* ∈ R* with R* ∩ S* ≠ ∅. Fix R* ∈ R*; then for 1 ≤ t ≤ ω,

#{S* ∈ R* : |R* ∩ S*| ≥ t} ≤ Σ_{s ≥ t} f*_{s,A*},   (18)

where A* ∈ H* is any set with A* \ W_0* = R*.

Explanation for (18): R* may appear several times in R*, as A* \ W_0* for several A* ∈ H*. For each S* on the left-hand side, fix B* ∈ H* with B* \ W_0* = S*; then |B* ∩ A*| ≥ |S* ∩ R*| ≥ t, so the number of choices for S* is bounded as claimed. For the sum we use (11), which is only valid for t ≤ αr; for larger t, we proceed as in (8), replacing K_0^t by (e/α)^t and assuming that K_0 ≥ e/α. So

Var(ν_R)/E(ν_R)² ≲ 1/E(ν_R) + Σ_{t=1}^{ω} (O(K_0)/(ε_1 κ p_1))^t = 1/E(ν_R) + O(K_0/(ε_1 C)).

(We have used κ p_1 = κ m/N = C and C ≫ K_0 to get the final equality.) The Chebyshev inequality implies that

P(ν_R = 0) ≤ Var(ν_R)/E(ν_R)².

Taking C(ε, ε_1) ≥ 13K_0/(ε ε_1) then verifies (6). (We use E(ν_R) → ∞ to justify the final conclusion.)

Powers of Hamilton cycles
We verify (4) for the hypergraph H whose edges correspond to the kth powers of Hamilton cycles. As in [7], we split this into two propositions, modifying their proofs for the case k = 2 to handle general k.

Proposition 1. Let S ∈ H and let T ⊆ S be a set of t ≤ n/3k edges inducing c components, and set v = |V(T)|. Then |H ∩ ⟨T⟩| ≤ (2k)^{2t} (n − v + c − 1)!/2.

Proof. Let T_1, ..., T_c be the components of the subgraph induced by the edges T, where (V(A), E(A)) denotes the pair of vertex and edge sets of a subgraph A. The upper bound on t implies that no T_j can "wrap around", and so |E(T_j)| ≤ k|V(T_j)| − (2k − 1) for each j, whence

t ≤ kv − (2k − 1)c.   (19)

We designate a root vertex v_j for each T_j and order V(T_j) by some order ≺_j that begins with v_j and in which each v ≠ v_j appears later than at least one of its neighbors. We may then bound |H ∩ ⟨T⟩| as follows. To specify an S ∈ H containing T, we first specify a cyclic permutation of {v_1, ..., v_c} ∪ ([n] \ V(T)); the number of ways to do this, namely (n − v + c − 1)!/2, is controlled by (19), which bounds v − c from below. We then extend to a full cyclic ordering of [n] (thus determining S) by inserting, for j = 1, ..., c, the vertices of V(T_j) \ {v_j} in the order ≺_j. This allows at most 2k places to insert each vertex (since one of its neighbours has been inserted before it and the edge joining them must belong to S), so the number of possibilities here is at most (2k)^v ≤ (2k)^{2t} (note that v ≤ t + c ≤ 2t), and the proposition follows.

(In the proof of Proposition 2 below we will use two counting facts: for u < t, the number of positive integer solutions to x_1 + ⋯ + x_c = u is \binom{u−1}{c−1}; and a connected subgraph of size t_i rooted at a given vertex v_i can be specified in at most (2ke)^{t_i} ways, since there are at most (Δe)^{t−1} rooted subtrees of the infinite Δ-regular tree, see Knuth [9].)
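For concreteness, the kth power of a Hamilton cycle is easy to generate: every vertex has degree 2k once n > 2k, so the corresponding hypergraph is kn-uniform (r = kn). A minimal sketch (our code, not from the paper):

```python
def hamilton_power_edges(n, k):
    """Edges of the kth power of the cycle 0, 1, ..., n-1: join each pair
    of vertices at cyclic distance at most k (assumes n > 2k)."""
    return {
        frozenset({i, (i + d) % n})
        for i in range(n)
        for d in range(1, k + 1)
    }
```

For example, the square (k = 2) of the Hamilton cycle on 8 vertices has 2 · 8 = 16 edges.
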

Final thoughts
Theorem 3 could possibly be improved in at least two ways. First, we could try to replace ε by o(1).
For specific examples such as the square of a Hamilton cycle, this can probably be done using the ideas of Friedgut [5], as suggested in [7]. Also, we could try to replace ε_1 by zero, which would require an improvement to the proof in Section 2.3 that we do not have at the moment.

Proposition 2. For T ⊆ S ∈ H with |T| = t ≤ n/3k, the number of subgraphs of T with c components is at most (4ke)^t \binom{2t}{c}.

Proof. To specify such a subgraph of T we proceed as follows. We first choose root vertices v_1, ..., v_c for its components, say T_1, ..., T_c; since |V(T)| ≤ 2t, the number of possibilities for this is at most \binom{2t}{c}. We then choose the sizes, say t_1, ..., t_c, of T_1, ..., T_c; here the number of possibilities is at most Σ_{u ≤ t} \binom{u−1}{c−1} ≤ 2^t. (For u < t, the summand is the number of positive integer solutions to x_1 + ⋯ + x_c = u.) Finally, we specify for each i a connected T_i of size t_i rooted at v_i, in at most ∏_{i=1}^{c} (2ke)^{t_i} ≤ (2ke)^t ways; this comes from the fact that there are at most (Δe)^{t−1} rooted subtrees of the infinite Δ-regular tree, see Knuth [9], applied with Δ = 2k. Multiplying these three bounds yields the proposition.

It follows from these two propositions that if S ∈ H and 1 ≤ t ≤ n/3k then (4) holds for suitable constants α and K_0.
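As a small sanity check (our code, not from the paper), one can classify the 3-edge subgraphs of the square of the Hamilton cycle on 8 vertices by their number of components and confirm that, in this tiny instance, the expression (4ke)^t \binom{2t}{c} comfortably dominates each count:

```python
from itertools import combinations
from math import comb, e

def n_components(edges):
    """Number of components of the graph spanned by `edges` (union-find;
    vertices not touched by any edge are ignored)."""
    parent = {}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v in edges:
        parent.setdefault(u, u)
        parent.setdefault(v, v)
        parent[find(u)] = find(v)
    return len({find(x) for x in parent})

# S: the square (k = 2) of the Hamilton cycle on n = 8 vertices.
n, k, t = 8, 2, 3
S = sorted({tuple(sorted((i, (i + d) % n))) for i in range(n) for d in (1, 2)})
counts = {}
for T in combinations(S, t):
    c = n_components(T)
    counts[c] = counts.get(c, 0) + 1

# counts[c] = number of t-edge subgraphs with exactly c components;
# each is dominated here by (4ke)^t * C(2t, c).
assert all(cnt <= (4 * k * e) ** t * comb(2 * t, c) for c, cnt in counts.items())
```
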