I see. I never download stuff into my home directory. In fact, I have a separate area that I put ISOs in: a work directory called "/work", and under it several directories, one of which is called "iso"; under that I have directories like "SuSe", "RedHat", etc. I put the SuSe files in "/work/iso/SuSe" to keep everything clean. Which directory were you in when you started the "ncftp" command? That's the directory your file would download to.
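Setting up that layout is just a couple of commands (these are the paths from my setup; use whatever names suit you):

    mkdir -p /work/iso/SuSe    # -p creates the whole /work/iso/SuSe tree in one go
    cd /work/iso/SuSe          # anything you download from here lands in this directory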
For instance, before downloading the SuSe ISOs I changed into "/work/iso/SuSe" by typing "cd /work/iso/SuSe", then ran the ncftpget command from my earlier post, then started the "ls -l" loop to watch the directory. You should see the "su8000_001.iso.gz" filename appear once ncftp starts downloading it, and every 5 seconds the loop does another directory listing so you can watch the file size increase.
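Put together, the sequence looks roughly like this. The ftp server and path below are only placeholders; use the exact ncftpget line from my earlier post:

    cd /work/iso/SuSe
    # start the download in the background, sending any output/errors to /tmp/suse.out
    ncftpget ftp://some.mirror.example/pub/su8000_001.iso.gz >/tmp/suse.out 2>&1 &
    # simple watch loop: list the directory every 5 seconds
    while true; do ls -l; sleep 5; done

Press Ctrl-C to stop the loop; the download itself keeps running in the background.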
If you used the ncftpget command that I posted in my message, any errors will go to the "/tmp/suse.out" file. To keep an eye on that file, you can type "tail -f /tmp/suse.out" in another window; any time a new error is written to the file it will immediately show on your screen. If things are working properly the file should be completely empty and should stay that way.
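So in a second window:

    tail -f /tmp/suse.out    # prints new lines the moment they are written
    # or just check the size now and then; 0 bytes means no errors so far
    ls -l /tmp/suse.out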
Of course, all of the above may sound extremely complicated, but once you get the hang of doing things this way it becomes second nature. And it all works the same whether you are in a local shell or in a remote shell on a server on the other side of the earth.
You could always use a graphical FTP client, but then you tie up part of your GUI for hours unnecessarily and have to stay logged in. Running things in the background gives you freedom.
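If you also want to be able to log out while the download keeps going, the usual trick is nohup; this is a general shell technique, not anything specific to ncftpget, and the server/path here is again just a placeholder:

    # nohup keeps the job running after you log out; output still goes to /tmp/suse.out
    nohup ncftpget ftp://some.mirror.example/pub/su8000_001.iso.gz >/tmp/suse.out 2>&1 &

Then you can log back in later and run "ls -l /work/iso/SuSe" to see how far it got.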