Linux gurus: question re: disappearing files/folders

I’ve got a problem with a Linux box running an oldish version of Red Hat (version 7). Some of my files and folders have disappeared. They still exist: the web server pulls the files and I can overwrite them just fine.* I can cd to the folders from the command line (and most of their contents show up). However, they don’t show up in the listing when I ls or ftp in. Any ideas what the heck is going on?

  • When I overwrite them, however, they still don’t show up in the directory listing.

Do their names begin with a period, by any chance? ls usually doesn’t list names that begin with a dot, simply by old *nix convention (such names have been reserved to important configuration files that would do well to be semi-hidden from the users).
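For what it’s worth, here’s a quick sketch of that convention (the /tmp directory and file names are made up for the demo):

```shell
# Names starting with a dot are skipped by a plain "ls".
mkdir -p /tmp/dotdemo
touch /tmp/dotdemo/.hidden /tmp/dotdemo/visible

ls /tmp/dotdemo       # shows only: visible
ls -A /tmp/dotdemo    # -A lists dotfiles too: .hidden  visible
```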

Other than that, I’m pretty well stumped. But I’m by no means a ‘guru’.

Nope. Normal file names. Things like “images” or “news.txt” (minus the quotes, natch).

Try “ls -l” — it shows more detail for each entry, so you might spot what’s going on.

That’s the letter L lowercase after the dash.

Try using ‘ls -l | less’ (that is, ls dash lowercase-ell pipe less) just to see if you aren’t somehow missing some entries. I’m beginning to suspect some oddities in how your terminal is handling entries that just don’t fit.

If that doesn’t work, I’m well and truly stumped.

What are the permissions of the subdirectory they exist in? You must have the ‘read’ bit set on a directory in order to list its contents. You can still access its contents, though, if it has the ‘execute’ bit set.
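Here’s a throwaway sketch of that behavior (hypothetical /tmp path; run as a regular user, since root bypasses the read check):

```shell
# A directory with execute but no read lets you reach names you
# already know, but not list them.
mkdir -p /tmp/permdemo
echo hello > /tmp/permdemo/news.txt

chmod 311 /tmp/permdemo       # owner gets -wx: traversal ok, listing denied
cat /tmp/permdemo/news.txt    # still works, via the execute (search) bit
ls /tmp/permdemo 2>/dev/null  # fails for a non-root user: no read bit

chmod 755 /tmp/permdemo       # add the read bit back
ls /tmp/permdemo              # now news.txt shows up again
```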

I second directory permissions. If you are accessing the files remotely through a web browser, they are probably being read under the uid/gid of the httpd process. That would be different from your uid/gid when you telnet/ssh/scp/ftp the files.

The user for httpd processes probably has permission to read the directory but you, the individual user, do not.

Worth looking into.

Ah, sounds like Terminus Est might have nailed it.

chmod +r *

would fix that, right?

That would fix the permissions of the files in the directory. What you want to do is fix the permissions of the directory itself.

$ chmod ugo+r /directory/name/here

Adjust “ugo” as appropriate to local security policy.

Mystery solved. Sorta. Looks like we got freaking hacked. Now the feds have our server…luckily we’ve got a replacement with updated OS on the way that should be done overnight…with *cough* the security holes hopefully plugged…

Anyway, thanks for the help–I really appreciated it.

You’re kidding, right? The box is important to you and you are hoping that it comes secure from the vendor? They don’t come secure from the vendor.

Take some proactive measures. It is not impossible to secure a web server, and it’s not black magic either; a basic hardening guide can get you started.

You’ll never have a hack-proof server if it accepts any input, but you can make it secure enough that hackers look for easier targets. If someone is determined to get into your box, and does, intrusion detection and forensic tools are important. Tools such as Tripwire are a good start. Logging to remote machines (just send your syslog messages and log files somewhere else) is another good strategy.
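A minimal sketch of remote logging with the classic syslogd that shipped with Red Hat-era systems (the loghost name is hypothetical):

```
# /etc/syslog.conf -- forward all messages to a remote log host
*.*    @loghost.example.com
```

After editing, reload syslogd with `killall -HUP syslogd`. The receiving machine’s syslogd must be started with `-r` to accept messages from the network. The point is that an intruder who gains root can scrub local logs, but not copies that already left the box.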

If you are too overworked or not confident in your experience to secure a web server that is important to your business, hire a consultant.

But don’t rely on default settings to protect critical information. Security strategies like that keep script kiddies in bragging rights.