I have a thought experiment in mind that I can't figure out.
Imagine a voltage source operating at, let's say, 1 GHz. One end is connected to a ground that is very small (abstractly, we can assume it is just a point in space) and stays at a constant voltage, 0 V. The other end is connected to an infinitely long single wire.
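Just for scale (my own back-of-the-envelope addition, not part of the setup itself): at 1 GHz the free-space wavelength is about 30 cm, so even short sections of the wire are electrically long. A quick sanity check on the numbers:

```python
# Quick sanity check: free-space wavelength at the source frequency.
c = 299_792_458      # speed of light in vacuum, m/s
f = 1e9              # source frequency from the thought experiment, 1 GHz
wavelength = c / f   # lambda = c / f
print(f"wavelength = {wavelength:.3f} m")  # prints "wavelength = 0.300 m"
```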
I wonder what different parameters, like input impedance, radiation, and voltage/current propagation, are going to look like.
If there were two parallel wires, there would be an electromagnetic wave travelling between them, no mystery about that.
Will there be any propagation in the single-wire case? If yes, I'm pretty sure it would involve a lot of radiation, and at some point all the energy would be radiated away. The question is: how far can the signal go? Also, what would the input impedance be in such a case? It would have to be resistive if radiation occurs, right?
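To convince myself that radiation really does show up as a resistive part of the input impedance, I tried the textbook formula for the radiation resistance of an electrically short monopole over a perfect ground, R_r ≈ 40π²(h/λ)². I realize my setup (point ground, infinite wire) is different, so this is only for intuition, not a model of my actual circuit:

```python
import math

c = 299_792_458          # speed of light, m/s
f = 1e9                  # 1 GHz, as in the thought experiment
lam = c / f              # free-space wavelength, ~0.3 m

def short_monopole_rr(h: float, lam: float) -> float:
    """Textbook radiation resistance (ohms) of an electrically short
    monopole of height h (h << lam) over a perfect ground plane."""
    return 40 * math.pi**2 * (h / lam)**2

# Hypothetical example: a 1 cm stub of the wire treated as a short monopole.
print(f"Rr ≈ {short_monopole_rr(0.01, lam):.2f} ohm")  # prints "Rr ≈ 0.44 ohm"
```

Even this tiny resistive term represents real power leaving the circuit as radiation, which is what makes me think the input impedance in my setup can't be purely reactive.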
I'm getting really mixed up about this because I "know" that propagation over a single wire is impossible. Is it really? Or is it impossible only because of radiation? What I mean is this: maybe (that's the part I don't understand) a voltage wave can propagate along a single wire, but at some point it radiates all of its energy away, so propagation over a single wire can effectively be regarded as "impossible" because of one very strong limitation, radiation.
I get even more confused when I think about it from the perspective of Maxwell's equations. I know that close to the source there would be an E-field vector between the two terminals of the source. What would the E vector look like if the voltage had propagated far along the single wire, say 1 km away?
I hope some of you could clear things up for me :)