Best way to view huge text files?

I need to view very large text files (SMTP logs) on a regular basis - sometimes up to 2GB in size.

Notepad is pretty shaky when you get over a certain size, and flat out refuses to open anything over 1GB.

I looked at an app called TextPad, which is apparently very good for opening large text files, but in the case of the 2GB file I think I might need something that splits the file into smaller pieces that I can open.

I’d really like to find something free, if possible.

I just use WordPad for anything too big for Notepad.

Nope, still too big. Something as big as 2GB tends to freeze and crash even WordPad.

Yikes. I don’t have any suggestions, other than trying a few of the text editors on offer in the freeware section of snapfiles.com and at freewarehome.com.

Is it possible to get the server to chop the logs smaller?

TextPad is a pretty good free editor, but you’re right, there’s no magic way to easily view very large files.

So, if you’re comfortable with *nix-style tools, or the idea of learning how to use some, I recommend cygwin (free). Then it’s a simple matter of typing in “tail -10000 filein.txt > fileout.txt” to split off the last 10000 (or whatever) lines into a new file.

And there’s lots of other useful stuff like grep and whatnot.
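
For example (a rough sketch - the filename smtp.log and the 554 reject code are just placeholders for whatever your logs actually contain):

tail -n 10000 smtp.log > recent.txt
grep "554" recent.txt > rejects.txt

The first line grabs the last 10,000 lines into a manageable file, and the second pulls out only the lines matching the pattern. Both read the file as a stream, so they never need the whole 2GB in memory.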

You can try UltraEdit for free for 45 days

http://www.ultraedit.com/index.php?name=UE_MoreFeatures

and see if it does the job. It claims to handle files up to 4 GB.

UltraEdit is a good suggestion. Another one in its class (30-day trial) is EditPlus; it makes no claim as to file size, but I haven’t run into any trouble well into the hundreds of MB.

One that is completely FREE is ConText, which claims an “unlimited” file size :dubious:

Good luck! :slight_smile:

With files that big, I really have to ask what you’re planning on doing with these files. If you want to make some change to every line, according to some pattern, a text editor program isn’t what you want at all (unless you plan to spend a few centuries doing it all by hand): For that, you want a text filter program, like sed or awk (Unix command-line programs; you’ll need to spend some time learning to use them). If you have to make extensive changes, but only in a few places, you have the problem of finding those places, whether by line numbers or by matching some sort of pattern. Almost all text editors can find and scroll to a particular word, but only some give you the option to “go to line 27649”, or the like.
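
To give a flavor of what a text filter looks like, here’s a minimal sed sketch (smtp.log and the pattern are purely placeholders for illustration):

sed 's/RCPT TO:<[^>]*>/RCPT TO:<redacted>/' smtp.log > smtp-clean.log
sed -n '27600,27700p' smtp.log

The first command rewrites every matching line as the file streams through, without ever loading the whole thing into memory; the second simply prints lines 27600 through 27700, which is one way to get at “line 27649” when your editor can’t jump there.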

Unless you’re on a really high-end machine, that file is bigger than your RAM, and probably bigger than your virtual memory as well. Switching editors isn’t going to help. What do you need from it?

Thanks for the suggestions, all. I checked out UltraEdit but it does way more than I needed - I just needed to read the files. In order to read a text file of that magnitude in Windows w/o hogging ALL of one’s resources, one needs to have it chopped into smaller bits.

Windows, I’ve found, isn’t very good at opening large text documents or displaying folders with a lot of contents.

I found a little app called Chop that did what I needed - I told it to cut the 2GB file into 95 30MB files, which I was then able to view one at a time in WordPad.

I had looked at others (like TextPad) but Chop was the cheapest ($7.50) one that let you tell it how much memory to use during the process.
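
(Side note: the cygwin tools mentioned above can also do that kind of splitting for free - a rough sketch, with smtp.log standing in for the real filename:

split -b 30m smtp.log smtp_part_

That cuts the file into 30MB pieces named smtp_part_aa, smtp_part_ab, and so on; split -l 500000 would cut on line boundaries instead, so no log line gets sliced in half.)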

Sorry, should have been clearer in my OP as to what I needed to do - view or split & view.

Thanks, though!

Is this something that you can use automation for?

e.g.: What are you looking for? There are plenty of log management tools out there that will let you set triggers like “show me all SMTP messages coming to us from xxx domain” or “show me all messages that appear to have spoofed headers”, etc., and serve up very manageable reports that would probably be printable on a single sheet of paper.
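
Even without a dedicated log tool, that kind of trigger often boils down to a one-liner with the tools already mentioned - a rough sketch, with the filename and domain as placeholders (the exact field layout depends on your mail server’s log format):

grep -i "from=<.*@example.com>" smtp.log > from_example.txt

That drops every line whose sender field mentions example.com into its own small report file.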

Since when is RAM size plus virtual memory the absolute limitation? Ever heard of paging (through a hard drive temp file)? WordStar had that in the 1970s, when 32KB was a lot of RAM and very few files fit into it.

I don’t know what the limitations of MS Word are, but can’t it handle files of nearly unlimited sizes?

Sorry, I wasn’t clear. A file that’s bigger than RAM + virtual memory can be read, but not quickly. That’s what I was trying to point out: the hardware can only read a multi-GB file so quickly, and switching software won’t affect that.

Maybe Notepad++? It’s free and open source.
http://notepad-plus.sourceforge.net/uk/site.htm

You mean “the Windows applications that I’ve tried aren’t very good at opening large text documents”. Operating systems don’t tend to care how big a file is when they open it.

Just open the Command Prompt (a.k.a. DOS window pre-Windows 95) and enter:

more filename.txt

If you are looking for specific data in the file (I’m sure a 2GB SMTP log isn’t something you just read for leisure), you can use the find and findstr commands. Enter a /? after these commands for info on syntax, usage and options.
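
For example (the filename and the address are just placeholders):

findstr /i "user@example.com" smtp.log > hits.txt
more hits.txt

findstr scans the file line by line and writes only the matching lines to hits.txt, which should then be small enough to page through with more; /i makes the match case-insensitive.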