Needed: monster text editor

I’ve got a 14 MB text file that I need to delete one line from.

Anyone know of a text editor that could open this file easily, allow me to do the edit, and then save the thing without taking three years to do it/crash my machine?

I’m running W2000.

I always used a hex editor for large files. Hmm. Vim might do the trick.

It wouldn’t pose a problem for Emacs, at least not under Linux. I’m fairly sure it would work equally well under Windows, though.

It is truly a ‘monster editor’, and can do everything except bake pizza. It takes a while to get used to, but personally I couldn’t live without it.

Vim appears to be a Unix-based application. Do they make a Windows version?

I remember using a really handy text editor for Windows from about 6 years ago, but I’ve forgotten its name.

Sorry, lee, should have scrolled down. Thanks.

Will also look at emacs.

Cheers.

Notepad under Windows 2000 and Windows XP does not have any file size limitations that I’ve been able to find. I’ve opened hundred-meg plus sized files without significant problem (although it can be slow to open as it has to load the entire file into virtual memory, which may take a while).

The old “Notepad cannot open files over 40k” limit that people often believe in was a W98 limitation that didn’t survive into W2K.

I recently loaded an 11MB plain text file into MS Word and had no problems using it. Just re-save in plain text after making your edit (use a separate filename, just to be safe in case there are any problems). If you have enough memory I don’t see why Word couldn’t handle a 14MB file (assuming you have Word, of course).

Every word processor that I’m familiar with has some kind of “Save to text” feature.

KellyM, it was the time problems that stopped me from using Notepad.

I used the Windows version of Vim. Though the installer is an ugly-ass Linux-style command-line port, and the GUI is hideous, it opened in seconds, saved in seconds, and didn’t dominate my CPU. Thanks a lot!

Addendum: Fourteen megs is not a very large file. I just created a 22 meg text file (1,000,000 copies of the same 24-character string) and opened it in Notepad. It took about 20 seconds.
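If anyone wants to reproduce that test, here’s one way to generate a similar file; this is just a sketch using Cygwin-style tools (`yes` and `head`), and the string and filename are made up:

```shell
# Build a ~24 MB test file: 1,000,000 copies of a 24-byte line
# (23 characters plus the trailing newline).
yes "abcdefghijklmnopqrstuvw" | head -n 1000000 > bigtest.txt
```

Opening the result in Notepad should give you roughly the timing described above, depending on your machine.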

Well, I took your challenge, and I admit defeat: it took 15 seconds to open the file, 8 seconds per editing operation, and 15 seconds to save, using Notepad. Well I never.

Aren’t fast computers an amazing thing?

Because we run so much bloatware these days, it’s easy to forget that your machine is inherently 400 or 500 times faster than an old XT. So if the task itself hasn’t changed (i.e. a dirt-simple file open in Notepad), the results are 400-500 times faster; or, put another way, 400-500 times the volume of data can be processed in the same time.

And considering that 14MB is a small fraction of most machines’ RAM these days, whereas before it might be more than the total VM, that means another speedup factor of 10x or 100x.

If you need to do this frequently, I would suggest learning the grep command. grep is a unix utility which has been ported over to the DOS world multiple times. I heartily endorse the one called xgrep, which I’m sure you can find by googling. To delete any lines containing your user name, the command would be something like:

xgrep -v "jjimm" infile.txt > outfile.txt

For someone who works with text files, this is an invaluable utility to know.

Atlantis Nova is pretty nice. Only 760k and freeware.
It runs amazingly fast compared to WordPad, and I also use it because it has a lot more features.

http://www.rssol.com/en/html/download/nova.htm

Indeed true; for some obscure reason, I loaded Win3.1 onto a P200MMX machine (painfully slow by today’s standards) a couple of years back. It took about 5 seconds to boot from cold, and there just wasn’t any way at all to get the hourglass to show.

And while you’re at it, you might also want to learn sed and/or awk. I suspect that the same folks who port grep over to Windows have done ports of these two “general-purpose text filters” (I use the Cygwin versions, personally), and between the two of them, they can do anything conceivable to any file you might care to send through them.
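For the original task (deleting the matching lines), the sed and awk one-liners would look something like this; a sketch assuming a Cygwin-style shell, with made-up filenames, and a tiny sample file standing in for the real 14 MB one:

```shell
# Small sample input, standing in for the real file.
printf 'line one\njjimm posted this\nline three\n' > infile.txt

# sed: delete every line that matches the pattern.
sed '/jjimm/d' infile.txt > out-sed.txt

# awk: print only the lines that do NOT match the pattern.
awk '!/jjimm/' infile.txt > out-awk.txt
```

Both produce the same output as the xgrep -v command above.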

And for those things you can’t do with sed or awk, there is always perl.

There is indeed a file size limitation. Just the other day, I found a Dr. Watson log file weighing in at a little over 1 GB ( :eek: :eek: :eek: ) lying around and wanted to see what was in it. Notepad complained, and Word XP told me its limit was a measly 32 MB.

I wasn’t that interested, so I just deleted it. :slight_smile:

awk, at least, is a complete programming language, so you can (in principle) do anything in awk. I’ll grant that there are probably things you can do much more easily in Perl, but I don’t know nearly enough about Perl to say what those things would be.

And filter-type programs like sed really don’t care what the file size is, since they only deal with a small part of the file at a time. If for some reason you wanted to run that 1 gig file through sed, it wouldn’t even flinch (although it might take a while).

Chronos, actually, sed has been proven to be Turing-complete, so you can theoretically do anything in sed, too. :slight_smile: