Highly advanced technology doesn’t allow you to get around the laws of the universe, and here the limiting law is signal to noise. Even with a large radio telescope the effective radiated power isn’t huge, and the signal strength falls off with the inverse square of distance. So, given the power radiated and the gain of the telescope, you can calculate the power per unit area available at the target.

Then work out the best possible noise floor. Assume a perfect receiver, one at absolute zero, so that the only noise present is the ~3 K cosmic microwave background. Knowing the transmission frequency and the bandwidth of the signal, you can work out the noise power, and from that the signal-to-noise ratio.

Finally you need the information rate of the transmitted signal. This isn’t the raw transmitted data rate, but rather the actual information content divided by how long the signal was transmitted for; that takes care of redundancy. Even transmitting a solid carrier for an hour carries information, essentially one bit per hour. Shannon then tells us whether the information gets through: the channel capacity is C = B log2(1 + S/N), and if the information rate exceeds that capacity, the signal cannot be recovered. (Redundancy in the signal helps the aliens: they can see correlations in the signal and improve the signal to noise, a step known as processing gain. But that is already included in the above if we simply ask what the fundamental, unencoded information rate is.)
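The chain of steps above (inverse-square flux, a kTB noise floor, Shannon–Hartley capacity) can be sketched numerically. This is a minimal illustration, not a real SETI calculation: the transmitter power, antenna gain, distance, receive aperture, and bandwidth are all assumed numbers you can swap out, not values taken from the post.

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
LY = 9.4607e15       # one light-year in metres

def link_budget(p_tx_w, g_tx, dist_m, a_rx_m2, t_sys_k, bw_hz):
    """Return (flux W/m^2, received power W, SNR, Shannon capacity bit/s)."""
    flux = p_tx_w * g_tx / (4 * math.pi * dist_m ** 2)  # inverse-square spreading
    p_rx = flux * a_rx_m2                               # power collected by the aperture
    noise = K_B * t_sys_k * bw_hz                       # thermal noise power, kTB
    snr = p_rx / noise
    capacity = bw_hz * math.log2(1 + snr)               # Shannon-Hartley limit
    return flux, p_rx, snr, capacity

# Assumed numbers for illustration: a 1 MW transmitter through a ~70 dBi dish,
# received 1000 light-years away by a planet-sized (Earth-radius) aperture,
# with a 3 K system temperature and a 1 Hz narrow-band receiver.
flux, p_rx, snr, cap = link_budget(
    p_tx_w=1e6, g_tx=1e7, dist_m=1000 * LY,
    a_rx_m2=math.pi * 6.371e6 ** 2, t_sys_k=3.0, bw_hz=1.0)
print(f"flux at target : {flux:.3e} W/m^2")
print(f"received power : {p_rx:.3e} W")
print(f"SNR            : {snr:.3e}")
print(f"capacity       : {cap:.3e} bit/s")
```

The last number is the one that matters for the argument: compare it with the signal's fundamental information rate (one bit per hour for a bare carrier). If the rate exceeds the capacity, no amount of cleverness at the receiving end recovers the signal.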
I very much doubt there is the slightest chance they can detect anything, even with an antenna the size of a planet.