7 ways of transferring files from remote server to localhost

Whether you are a developer, a DevOps engineer or a sysadmin in charge of multiple servers, you'll always find yourself needing to move files from one machine to another. It's not an easy feat for first-timers, but it gets easier with a bit of experience.

There are several ways of moving files and directories, such as a newly developed website or a backup, from your local machine to a remote server and vice versa. Different methods use the ssh, ftp or http protocols to achieve the same goal. I prefer the SSH-based methods since they add a level of security while moving your data.

scp

This is probably my favorite when moving files from one box to another. scp uses ssh behind the scenes, which means you get added encryption whilst moving files.

The general syntax is;

scp from-source to-destination

e.g. to copy a file from the remote box into a local directory;

scp dave@my-remote-box.com:/home/dave/file.txt /home/dave/docs
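To go the other way, from your local machine to the remote box, simply swap the order of the arguments (here reusing the same box and paths as above);

scp /home/dave/docs/file.txt dave@my-remote-box.com:/home/dave/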

If you are copying a directory, don't forget to add the -r option.
If you wish to specify the ssh port of the remote server, add the -P option (notice it's uppercase). So if I were copying a directory from a remote box whose ssh port is 1234 to my local machine, that would be;

scp -rP 1234 dave@my-remote-box.com:/home/dave/remote-dir /home/dave/docs

You'll be prompted for the ssh password each time you make a transfer. However, if you set up passwordless ssh, you won't be prompted every time.
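Setting up passwordless ssh is a one-time job. A minimal sketch, assuming you don't already have a key pair;

ssh-keygen -t rsa
ssh-copy-id dave@my-remote-box.com

The first command generates a key pair (accept the defaults), and the second installs your public key on the remote box so that subsequent scp, rsync and ssh sessions log in without a password prompt.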

rsync

rsync is probably my second favorite method of copying files between two computers. It would have been my favorite if it weren't for a potentially dangerous gotcha newbies might fall for, which I'll discuss later.

The advantage of rsync over the other options is that it copies only the files/directories that have changed. It doesn't copy the entire directory each time you initiate a transfer, which makes it best for incremental backups and for pushing website or application code changes to a production box. It's sort of like git, but without the version control. It's really cool.

Now if you are copying files within the same machine, or from an external hard drive to your machine, you can use rsync plainly the way you use cp, e.g.

rsync from/source/file to/destination/dir

But if you wish to copy files from a remote pc whose ssh port is 1234, you can use ssh together with rsync like this;

rsync -have "ssh -p 1234" dave@my-remote-box.com:/home/dave/remote-file.txt /home/dave/docs

You can read more about rsync options here, but the 'have' in the command above is really four short options combined;

-h: output numbers in a human-readable format.
-a: archive mode; a quick way of saying you want recursion and to preserve almost everything.
-v: verbose; gives you information about what files are being transferred and a brief summary at the end.
-e: specify the remote shell to use; here, ssh on port 1234.

If you are syncing a remote directory with a local one, use the same syntax, except add a trailing slash at the end of the source directory path, like this;

rsync -have "ssh -p 1234" dave@my-remote-box.com:/home/dave/remote-dir/ /home/dave/local-dir

Now the gotcha: if /home/dave/remote-dir/ is empty (for instance, you mistakenly use a different, empty dir such as /home/dave/remote-empty-dir) and you rsync with the --delete option, it'll empty the destination dir /home/dave/local-dir. So be very careful about the ordering of the directories.
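If you are ever unsure what a particular rsync run will do, add the -n (--dry-run) flag first; it lists the changes rsync would make without actually making any of them;

rsync -n -have "ssh -p 1234" dave@my-remote-box.com:/home/dave/remote-dir/ /home/dave/local-dir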

ftp

My good-old-friend ftp is still very much in use today, as it was in the early days of the internet. FTP uses plain-text authentication by default; however, there are now implementations that add a layer of protection, such as FTPS (FTP over SSL/TLS) and SFTP (file transfer over SSH). I encourage you to use either of them.
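sftp, for instance, ships with OpenSSH and works much like the classic ftp client, only over ssh. A quick sketch, reusing the remote box and ssh port from the scp example above;

sftp -P 1234 dave@my-remote-box.com
sftp> get remote-file.txt
sftp> put local-file.txt
sftp> bye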

Web developers and web hosting companies make heavy use of ftp for file transfer, and devs default to a multitude of ftp clients to copy files. My favorite ftp client is Filezilla, which is pretty awesome, but there are alternatives in the wild such as;
* CoreFTP
* WinSCP
* Transmit (Mac OS X)
* FireFTP (all platforms with Firefox)
* Cyberduck (Mac OS X)

When you want to copy files from one remote server to another, you might not have the option of using a GUI ftp app unless you first download the files to your local machine and then re-upload them to the destination box. To save time and bandwidth, I use the command-line ftp client.

So let me share some ftp commands, thanks to the awesome guys at cyberciti.biz;

$ ftp my-remote-box.com
ftp> ls
ftp> cd dirName
Task: Download / Copy file
To copy one file at a time from the remote ftp server to the local system, use the get command:

ftp> get fileName
ftp> get fileName newFileName

To change directory on your local system, enter:
ftp> lcd /path/to/new/dir

Print local directory:
ftp> lpwd

Task: Download Multiple Files

To download all files, enter:
ftp> mget *

To download all perl files (ending with .pl extension), enter:
ftp> mget *.pl

Task: Upload One File
To copy one file at a time from the local system to the remote ftp server, enter:
ftp> put fileName

Task: Upload Multiple Files

Upload all files, or all perl files, from the current local directory:
ftp> mput *
ftp> mput *.pl

Task: Create a Directory
To make a new directory, enter:
ftp> mkdir dirName

Task: Delete a Directory
To remove or delete a directory, enter:
ftp> rmdir dirName

Note: the ftp client can't copy directories recursively. Use wget instead, like this;

wget -m --user="dave" --password="my-secure-password" ftp://ftp.my-remote-box.com/remote-dir

That will recursively copy the remote ftp directory remote-dir from my-remote-box.com using the ftp username dave and the password my-secure-password. This really is a pro tip, since the ftp command line unfortunately won't recursively copy directories.

wget

The amazing wget is the download manager that rules them all. Nothing beats wget when you want to copy files from a remote box to localhost. The only limitation is that you can't move files from localhost to the remote box. It's a one-way trip, but an awesome trip nonetheless.

Downloading a file from a remote server is as simple as giving wget a link to download.

wget http://my-remote-box/remote-file.txt

To recursively download a directory using wget, enter;

wget -r --no-parent http://my-remote-box/remote-dir

The parameters mean:
-r: download recursively.
--no-parent: don't download anything from the parent directory.

To download a whole website with wget;

wget --random-wait -r -p -e robots=off -U mozilla www.example.com

You can visit explainshell.com to get a breakdown of this long command, and labnol.org or gnu.org for more examples of the magic of wget.

Samba/smb

Finally, we visit a very good old friend, almost as old as FTP and the internet itself. Samba uses the smb/cifs protocol, majorly to share files between Windows computers and Linux machines. There's a long, dirty history between Microsoft and the FOSS community, but things seem to be working out pretty well now that M$ has changed strategy in recent years.

Nonetheless, samba is probably your best option for sharing files between a Linux server and Windows machines within the same LAN. It does NOT work over the internet.

After you have set up a samba server, which is beyond the scope of this post, you can access the remote share in the following ways;

On the windows machine, type this at Run:

\\ip-of-the-samba-pc\shared

Take note of the back slashes.

On a Linux/ubuntu machine, open the file browser and type this in the location bar;

smb://ip-of-the-samba-pc/shared

Take note of the forward slashes.

You may also be able to access files by directly specifying their names in the form \\HOST\SHARE\PATH\FILE, where HOST is the computer's NetBIOS name, SHARE is the name of the share, and \PATH\FILE is the path to the file relative to the share's root. Take note that when accessing from windows, you must use the samba server's IP or NetBIOS name rather than its domain name itself!
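If you'd rather use the Linux command line than the file browser, you can also mount the share directly. A minimal sketch, assuming the cifs-utils package is installed, the share user is dave, and a local mount point /mnt/share exists;

sudo mount -t cifs //ip-of-the-samba-pc/shared /mnt/share -o username=dave

Once mounted, the share behaves like any local directory, so plain cp works.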

NFS

Network File System or NFS is my favorite method for sharing files between two Linux machines on the same LAN. I discovered Samba shares can't write file names with special characters; specifically, I noticed this while working with email files with names such as 1451825736.V68I5391eb4M572897.example.com:2,.

NFS works much like the Unix file system, while Samba works more like Windows.

Usually I mount a remote NFS share on the local machine after setting up the NFS server. Once the remote share is mounted locally, it behaves like any other directory on your local machine, making copying files really easy.
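As a sketch, assuming the server exports a directory /srv/shared and you've created a local mount point /mnt/nfs;

sudo mount -t nfs ip-of-the-nfs-server:/srv/shared /mnt/nfs
cp /mnt/nfs/remote-file.txt /home/dave/docs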

netcat and python

You can create a simple webserver with a single python command that serves all files in the current directory (and sub-directories) and makes them available to anyone on your network. You might have to open a firewall rule for the particular port you are using; by default the command uses port 8000. You can of course restrict access to particular IPs with your firewall. I use this trick a lot when a client requests their website or mail files from a server.

python -m SimpleHTTPServer 9000 (Python 2, port 9000)
python -m http.server 9000 (Python 3, port 9000)
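From any other machine on the network you can then pull files with wget or a browser. For example, assuming the serving machine's IP is 192.168.1.10;

wget http://192.168.1.10:9000/remote-file.txt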

I find the python command really simple and easy to remember, but you can use netcat as well.

To transfer a file with netcat, first start listening on the machine that has the file;

At the server side;
nc -l -p 9999 < remote-file.txt
or
cat hugefile.ext | nc -l -p 9999

Then connect from the other machine and redirect the output into a file;

At the client side;
nc server.ip 9999 > remote-file.txt
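netcat can also move whole directories if you pair it with tar. A sketch, following the same pattern with the sending side listening;

At the server side (the machine with the directory);
tar cf - remote-dir | nc -l -p 9999

At the client side;
nc server.ip 9999 | tar xf -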

So there you have it; 7 ways to transfer files between two machines. The right option depends on whether the machines are on the same network or connected via the internet, whether they both run the same OS (Linux) or a mix of Windows/Mac and Linux, whether security is paramount or not, and whether I need a quick and dirty solution or long-term remote file sharing.

Image: gcn.com

David Okwii

David Okwii is a Ugandan-based Technology writer and System's Engineer.

Kampala Uganda http://www.davidokwii.com
