File Permissions Lost by FTP?

I am trying to transfer the contents of a CD from a Macintosh to a Linux server via FTP. The files all transfer successfully except that those with execute permission on the CD no longer have execute permission on the Linux server.

My experience with both FTP and Linux is limited, so I’m probably missing something obvious here. I assumed that the file permissions would be preserved. Why would the permissions change?

The owner of the files on the CD is root, but when I connect to the Linux server with the FTP program I use my own user ID. Would that cause the loss of the execute permission?

Could the FTP server be configured in a way that is causing the permissions to be lost?

By the way, reversing the process and transferring a file with execute permission from the Linux server to my Mac causes the file to lose its execute permission too.

Is there a way to preserve the file permissions?

Nope, nothing wrong with FTP. It only transfers the file contents, not any “meta” information like file permissions.
I’m not a Mac guru, but one of these should work.

  1. Use a utility like “tar” to make a copy of the CD, transfer the tarball and untar it on the Linux box. This will preserve the file permissions.
  2. “rip” the CD on the Mac to get an ISO image file. I’m not sure what software you’d use to do this on the Mac, but I’m sure it exists somewhere. Transfer the ISO image file, then mount it on the Linux box using the loopback device. Voilà, everything just as it appeared on the original CD, including the directory structure and file permissions.
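To make option 1 concrete, here’s a minimal sketch. The mount point /Volumes/MYCD and the destination path are assumptions — substitute whatever your CD is actually called and wherever you want the files to land:

```shell
# On the Mac: archive the CD contents, recording the permission bits
# (/Volumes/MYCD is an assumed mount point -- use your CD's actual volume name)
tar -czf cdcopy.tar.gz -C /Volumes/MYCD .

# Transfer cdcopy.tar.gz with your FTP client in binary mode, then on the Linux box:
tar -xzpf cdcopy.tar.gz -C /path/to/destination   # -p restores the recorded permissions
```

The -p flag on extraction tells tar to apply the archived permission bits rather than filtering them through your umask.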

Not true. A good FTP program (I use lftp) can read the file permissions on the server side and then duplicate them on the local copy. However, this is not the default behavior, because in general you just want the copied files to get their usual default permissions, according to your umask. Note that very simple FTP clients (like, say, your web browser) won’t have this functionality. Whatever client you’re using, the option to preserve permissions should be listed in its documentation.

Dada is right, the permissions will not be preserved with FTP.

How many executables are there on the CD? Your FTP client may have an option to fix the permissions after you upload (WSFTP does), or of course you can always telnet into the server and manually set the permissions, which for me is faster and easier using wildcards.

I didn’t know that, Some Guy, as I rarely use GUI FTP programs any more and most of my FTPing is done textually through a command prompt or telnet window.

In any case it’s not a function of the FTP protocol itself to preserve file permissions, though I suppose it would be trivial for a GUI client to read the permissions first, upload the file, and then modify the permissions on the server side (many FTP servers accept a SITE CHMOD command for exactly this).

Thanks for the heads up :slight_smile:

Really? I use an excellent FTP program and file permissions don’t duplicate.

Then again, if you’re FTPing between machines with different operating systems, how can file permissions duplicate?

Methinks this would be a security hole if it were possible.

Like I said, check the documentation for your FTP program - it’s probably got the option in it, though it may be buried under “mirroring”, since that’s the most common use for such functions - after all, if you only send one file, you can just chmod it. Linux and Mac OS X both use Unix-style permission bits with the exact same meaning (f’rinstance, both OSes will understand the octal permission string 0755 to mean “only the owner can write to this file, but everyone else can read or execute it”), so the sender only has to pass a number to the downloading computer - either OS can then correctly interpret that number on the receiving end.
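As a quick illustration of that octal notation (the file name here is arbitrary):

```shell
touch script.sh
chmod 755 script.sh   # octal 755 = rwxr-xr-x
ls -l script.sh       # owner may read/write/execute; group and others may read/execute
```

Each octal digit is just the sum of read (4), write (2), and execute (1) for owner, group, and others respectively, which is why the same number means the same thing on both systems.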

It’s only a security hole to the extent that having FTP enabled is a security hole - generally, if you’re allowing untrusted people FTP access to your machine, you only allow access to a very specific directory, and make certain it’s not in the command path.

Nope, not there. But that’s expected when the O/Ss are different.

As for FTPing between *nix flavors, again no go, at least for us at work. It would be a security hole.

Thanks. :slight_smile:

Well, of course, if you’re at work, the definition of a security hole is that which causes your IT guy to say “that’s a security hole” - if they don’t like it, that’s the end of the story.

:stuck_out_tongue:

Yes, the permissions are inherent in the filesystem and the OS. Unless your Mac supports Unix-style file permissions, you’re stuck.

What you CAN do is, as you’re logged into the server and about to start copying, try sending the command
umask 022
That clears the write bits for group and others, so new directories are created 755 (-rwxr-xr-x); note that new regular files start from 666, though, so they come out 644 and still need a chmod to get the execute bit.
Then send the files.
If that doesn’t work, ssh (I almost never use telnet) to the Linux box, cd to wherever you put the files, and do this:
chmod 755 *
to set everything world-readable and -executable, writable by owner.
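For what it’s worth, here’s how a umask of 022 plays out, which you can try locally before sending it over FTP:

```shell
umask 022      # clear the write bits for group and others on anything newly created
touch newfile  # regular files start from 666, so this ends up 644 (rw-r--r--)
mkdir newdir   # directories start from 777, so this ends up 755 (rwxr-xr-x)
ls -ld newfile newdir
```

So umask alone gets the directories right, but the execute bits on the files themselves still have to be added with chmod afterwards.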

Thanks everyone.

I am transferring literally hundreds of files in a deeply nested directory structure. I had thought of using chmod, but with so many files, I will leave that as a last resort.

The FTP server does not seem to understand the umask command. My FTP client (Fetch) does have a mirroring option. I may give that a try. I also like Dada321’s idea of using the tar utility. I may also try FTP from the command line on my Mac; I’ve never done that before and am not sure what capabilities it has. I’m just beginning to explore the Unix side of OS X.

The tarring or zipping idea is certainly a good one. Not only will it solve the file permissions issue, it will also vastly speed up the transfer, given the large number of files in the deeply nested directory structure you described.

FTP negotiates a new data connection for each file sent, and has to do directory creation and navigation between a lot of the files. There will be a pause at each step while the client waits for the server to confirm that the last step has completed. Even when moving large directory trees between two machines on the same 100Mb network, it is much faster to zip/tar the files into one package before transferring them. On a remote connection, the file compression (if used) will speed things up even more.

Well, even with a deeply nested directory structure, chmod is still your friend: use the -R option to make mode changes recursive.

chmod -R a+x *

as a last resort, of course.
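One caveat with a blanket a+x: it marks every data file as executable too, not just the programs. If you’d rather keep ordinary files non-executable, find can treat directories and files separately (a sketch; the example path at the end is hypothetical):

```shell
find . -type d -exec chmod 755 {} +   # directories: rwxr-xr-x (need x to be traversable)
find . -type f -exec chmod 644 {} +   # regular files: rw-r--r--
# then re-add the execute bit only on the files that need it, e.g.:
# chmod 755 bin/some-program
```

The `{} +` form batches many paths into each chmod invocation, which matters when there are hundreds of files.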

I tried the tar idea this morning. It worked like a charm.

Thanks again! :slight_smile: