Since you insist, I will try to explain in more detail why this is unlikely without some additional input, such as a new idea or technique.
Assume a bound of the form
2^l - 3^s > \cfrac{3^s}{l^a} \quad (whenever \frac{s}{l} < \log_3 2)
where a is a strictly positive constant, to encompass all possible levels of strength (obviously, this cannot hold when a is too small). The above inequality can be derived from Rhin's bound with a = 13.3. At best, the optimal bound might be a = 1+\varepsilon with \varepsilon > 0 arbitrarily small (and s sufficiently large), which corresponds to Roth's level, as @Collag3n puts it.
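As a sanity check (not a proof, of course), one can verify this inequality numerically for small l, taking s = \lfloor l \log_3 2 \rfloor so that s/l < \log_3 2 holds automatically; the sketch below assumes a = 13.3:

```python
import math

# Sanity check (not a proof) of the assumed bound
#   2^l - 3^s > 3^s / l^a   with a = 13.3,
# for s = floor(l * log_3 2), which forces s/l < log_3 2.
A = 13.3
LOG3_2 = math.log(2) / math.log(3)

def bound_holds(l: int, a: float = A) -> bool:
    s = math.floor(l * LOG3_2)           # largest s with 3^s < 2^l
    gap = 2**l - 3**s                    # exact integer arithmetic
    # compare in log scale to avoid overflow:
    # log(2^l - 3^s) > s*log 3 - a*log l
    return math.log(gap) > s * math.log(3) - a * math.log(l)

print(all(bound_holds(l) for l in range(2, 201)))   # True
```

The tightest cases are the continued-fraction convergents of \log_3 2 (such as l = 65, s = 41), and the bound still holds there with plenty of room.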
Assume the existence of a nontrivial cycle of length l starting at a positive integer n with parity vector v. Using these notations, we can write that
n = \cfrac{3^s n + \beta(v)}{2^l}
where s is the number of odd terms and \beta(v) the remainder numerator, so that
n = \cfrac{\beta(v)}{2^l - 3^s}.
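For concreteness, here is a small sketch (assuming the map T(n) = (3n+1)/2 for odd n and n/2 for even n) that computes \beta(v) symbolically from a parity vector and checks the formula on the trivial cycle 1 \to 2 \to 1:

```python
def beta(v):
    """Compute beta(v) for a parity vector v, where 1 means an odd
    step n -> (3n+1)/2 and 0 an even step n -> n/2.  We track
    T^k(n) = (A*n + B) / 2^k symbolically; after all l steps,
    A = 3^s and B = beta(v)."""
    A, B = 1, 0
    for k, bit in enumerate(v):
        if bit:   # (3*(A*n + B)/2^k + 1)/2 = (3A*n + 3B + 2^k)/2^(k+1)
            A, B = 3 * A, 3 * B + 2**k
        # an even step only raises the power of 2 in the denominator
    return B

# Trivial cycle 1 -> 2 -> 1 under T: v = (1, 0), so l = 2, s = 1
v = (1, 0)
l, s = len(v), sum(v)
print(beta(v), beta(v) // (2**l - 3**s))   # 1 1, i.e. n = 1/(4 - 3) = 1
```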
First, let us consider the case of a circuit with parity vector v = 1^{s} 0^{l-s}. This case is “easy” to rule out (by taking advantage of the power of Rhin's bound) for two reasons:
- \beta(v) is minimized and its value is simple to express: \beta(v) = 3^s - 2^s;
- n gives rise to a sequence of s successive odd integers, so, using a well-known result of Terras, we have n \equiv -1 \pmod{2^s}, which implies n \geq 2^s - 1.
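Both facts are easy to check numerically; a quick sketch (again with T(n) = (3n+1)/2 on odd n):

```python
def beta_circuit(s):
    """beta(v) for the circuit v = 1^s 0^(l-s): each of the s odd
    steps maps B -> 3B + 2^k, and the even steps leave B unchanged."""
    B = 0
    for k in range(s):
        B = 3 * B + 2**k
    return B

# Claim 1: beta(1^s 0^(l-s)) = 3^s - 2^s
print(all(beta_circuit(s) == 3**s - 2**s for s in range(1, 30)))   # True

def odd_run(n, s):
    """Check that n starts with s consecutive odd terms under T."""
    for _ in range(s):
        if n % 2 == 0:
            return False
        n = (3 * n + 1) // 2
    return True

# One direction of Terras's result: n = -1 (mod 2^s) does give s odd terms
print(all(odd_run(2**s * k - 1, s)
          for s in range(1, 12) for k in range(1, 20)))   # True
```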
Putting everything together, we obtain
\cfrac{3^s}{l^a} < 2^l - 3^s \leq \cfrac{3^s - 2^s}{2^s - 1}
and
l^a > \cfrac{3^s}{3^s - 2^s}(2^s - 1) > 2^s - 1 \sim 2^{\,l \,\log_3 2}.
Whatever the value of a, the LHS grows much more slowly than the RHS, so it is not difficult to conclude the proof using Eliahou's lower bound on l. In fact, it would have been easier to use Ellison's bound instead of Rhin's, since it is stronger for small l and s.
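To illustrate why the conclusion is easy: with s = \lfloor l \log_3 2 \rfloor, the inequality l^a > 2^s - 1 can only hold for l below a modest threshold, which a direct scan locates (a = 13.3 assumed):

```python
import math

A = 13.3
LOG3_2 = math.log(2) / math.log(3)

# With s = floor(l * log_3 2), l^A grows polynomially while 2^s - 1
# grows exponentially, so l^A > 2^s - 1 survives only for small l.
candidates = [l for l in range(2, 2000)
              if l**A > 2**math.floor(l * LOG3_2) - 1]
print(max(candidates))   # a little over 150: only finitely many l survive
```

Any lower bound on l beyond that threshold (Eliahou's is vastly larger) then finishes off the circuit case.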
Here, I have to mention the magical trick found by @mathkook, who, in his no-circuit proof, managed to get through using only the trivial bound n > 1. His idea was to obtain another integer expression in which \beta(v) is replaced by something much smaller (close to 2^{l-s}).
Now, in the general case of a cycle, things are much more difficult:
- the general expression of \beta(v) is rather complicated and we only have upper bounds of the form \beta(v) < c \,s \,3^s for some constant c>0 (when n is the smallest term of the cycle);
- we do not have a lower bound on n except n > N, where N is the current record for the computational verification of the Collatz conjecture.
In the end, we get
l^a \, s > \cfrac{N}{c}
which is inconclusive, even for ridiculously small values of a. It only implies a lower bound on l (assuming an optimal exponent a \simeq 1, we get a lower bound of approximately the same magnitude as in Eliahou's paper). So, we are stuck.
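To make “inconclusive” concrete, here is a rough computation under assumed values N \approx 2^{68} (the order of magnitude of current verification records) and a hypothetical c = 1, with s \approx l \log_3 2:

```python
import math

LOG3_2 = math.log(2) / math.log(3)
N = 2.0**68   # assumed order of magnitude of the verification record
c = 1.0       # hypothetical constant in beta(v) < c * s * 3^s

def l_lower_bound(a):
    # l^a * s > N/c with s ~ l*log_3 2  =>  l > (N / (c * log_3 2))^(1/(a+1))
    return (N / (c * LOG3_2)) ** (1.0 / (a + 1.0))

print(f"a = 13.3: l > {l_lower_bound(13.3):.0f}")   # roughly l > 28
print(f"a = 1   : l > {l_lower_bound(1.0):.2e}")    # roughly l > 2.2e10
```

Even the strongest conceivable exponent a \simeq 1 only yields a cycle-length bound of the magnitude just mentioned, and a = 13.3 gives a bound that is completely trivial.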
In fact, it would be great if we had a lower bound on n growing exponentially fast with l, as in the circuit (aka 1-cycle) case. In the case of an m-cycle, such a lower bound can be obtained, but the growth rate depends on m and decreases exponentially fast as m increases. As soon as m gets larger than, say, 1000, it becomes useless.
So, is it hopeless? Not completely. There is numerical evidence that such a lower bound on n holds for any sequence starting at n with a parity vector v of length l and weight s \approx l \log_3 2:
n \geq \cfrac{\alpha^l}{l} \quad with \alpha =1.035\ldots
even if it’s not a cycle, provided that n is an integer. The exact value of \alpha arises from a simple heuristic reasoning based on Terras/Lagarias results and is a particular case of a more general hypothesis that has already been investigated in this paper. The reasoning may seem a bit naive and far-fetched, but surprisingly it works pretty well computationally for any l. Moreover, this lower bound would be sufficient to settle the case of nontrivial Collatz cycles. I don’t think that this lower bound is easy to prove, though…
Another approach might be to follow @mathkook’s idea dealing with redundancy in parity vectors and try to replace \beta(v) by something smaller. It’s still too soon to say how far we can go that way.
Well, this was quite an extensive post. Writing it down helped me clarify my thoughts, and it may inspire some readers as well.