I have Adobe Acrobat 7.0 Pro installed on my machine and whenever I load a large document to work on, it is SLOW, and my CPU works overtime according to Task Manager. When I try to do something in the document, CPU usage for Acrobat is typically 80-90%. Sometimes, during navigation through the document, I have to wait a long time for my screen to refresh (10-30 seconds), especially if I’m moving too fast. The computer gets “stuck” and I am unable to do anything within the document. The document is on my local hard drive. I have a Dell workstation of recent vintage.
I’ve upgraded my system RAM to 1GB and I still have the same problem. I have a 5 year old video card in my system with 8 MB of memory but the tech support at work can’t tell me if that has anything to do with my Adobe problems.
Here are my questions.
How does a video card affect my computer performance? What problems would I see with applications in general if I don’t have a good video card and/or video RAM? I haven’t found a resource that explains this in a satisfactory way.
Do the CPU and system RAM take over when the video card is lacking? What happens if the video resources are not sufficient?
Please remember this is Adobe Acrobat and not some 3D game I’m talking about. My PDF documents have lots of graphics.
Hard to say without more details on your system, but in general the stuff Dell and others sell at low to moderate prices seriously lacks in the video card department. High-end video cards can often cost more than some low-end computers, so it makes sense that to keep the price point down, the video card is frequently where they skimp.
Video cards use something called a “frame buffer” to hold what is on the screen at any given moment. Stuff off the screen (e.g. text further down the page) is not included in it. When you want to scroll down, the computer needs to access memory to see what is supposed to be there and move it to the frame buffer.
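If it helps to picture it, here is a rough sketch in Python of what scrolling amounts to. This is purely conceptual and every name in it is invented for illustration; real video drivers do all of this in hardware.

```python
# Conceptual sketch only: scrolling means copying a different slice of
# the rendered page into the frame buffer. All names here are invented
# for illustration; real drivers do this in hardware.

SCREEN_W, SCREEN_H = 1024, 768

def render_page():
    # Stand-in for the full rendered document held in memory:
    # one row of pixels per line, far taller than the screen.
    return [[0] * SCREEN_W for _ in range(SCREEN_H * 10)]

def scroll_to(page_pixels, top_row):
    # The frame buffer only ever holds what is on screen right now,
    # so scrolling refills it from the page image in memory.
    return page_pixels[top_row:top_row + SCREEN_H]

page = render_page()
fb = scroll_to(page, 0)      # top of the page
fb = scroll_to(page, 2000)   # user scrolls down: copy a new window in
```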
The problem here comes in the bewildering array of memory technologies and configurations video cards can use. Many built-in graphics chips (i.e. on the motherboard) share system memory with the computer. Others can use EDO RAM, VRAM, WRAM and some others. The lower-end the graphics card, the more they will skimp on the video memory.
For instance, VRAM and WRAM are dual-ported, meaning the memory can be read and written simultaneously for a dramatic increase in video performance. When scrolling down the page, the RAMDAC (the gizmo that converts your digital computer signal to an analog signal your monitor can use) needs to access the memory many times per second to read what is supposed to be displayed. If at the same time the computer is trying to update the memory with new data, the two conflict, and you get pauses on your screen or that sort of choppy rather than smooth scrolling.
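To put rough numbers on “many times per second” (these figures are my own assumptions, not the OP’s actual setup): at 1024x768 with 32-bit color and an 85 Hz refresh, the RAMDAC reads roughly 255 MB of pixel data every second just to keep the picture on screen.

```python
# Back-of-the-envelope math with assumed figures (not the OP's machine):
# how much memory traffic the screen refresh alone generates.

width, height = 1024, 768     # assumed resolution
bytes_per_pixel = 4           # 32-bit color
refresh_hz = 85               # typical CRT refresh rate of the era

frame_bytes = width * height * bytes_per_pixel
refresh_bandwidth = frame_bytes * refresh_hz  # bytes/second, reads only

print(f"{frame_bytes / 2**20:.1f} MB per frame")                # ~3.0 MB
print(f"{refresh_bandwidth / 2**20:.0f} MB/s just to refresh")  # ~255 MB/s
```

And that is just the refresh. On single-ported memory, every update the computer writes has to squeeze in around that constant traffic, which is exactly where the stalls come from.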
Add to that bus speeds (the path the data travels through the system). Not all buses are created equal, so if you have a built-in video card that uses system memory, mostly relies on the processor to do calculations, and transfers data over a normal PCI bus, your video performance is going to suck. All sorts of things are clamoring for shared system resources, and the result is pauses and hiccups.
Finally, a lot has to do with the program itself, and I do not know enough about Adobe to say how it works. Many 3D games will detect a high-end graphics card and offload as much of the number crunching as they can to the video board. If that is not there, the rest of the system has to pick up the slack. My guess is Adobe does not do this and likes to gobble up system resources. I do not know if a high-end graphics card with a lot of its own RAM would be smart enough to grab data (e.g. the rest of the document, even though it is not displayed yet) and move it into its own RAM. That would be ideal, since high-end video cards are very adept at shovelling video data to themselves, but whether the card can do this on its own or the people who wrote the program have to tell it to, I do not know.
Long story short, yes, a high-end graphics card can improve the issues you face, assuming your system can take one (i.e. has a PCI Express or AGP slot). Before you buy a new one, however (assuming you want to), I’d take a sample file to the store and have them load it up on a system there to see how well a given setup performs. It may not be a fair test, since your computer overall is likely different from the one you are testing on (CPU speed, motherboard, RAM, etc.), but at least it can give you a sense of what is possible.
It’s not unusual for the foreground application (the one you’re working in) to get 80+ percent of the CPU while you’re working in it; Windows can (and does by default) prioritize that application over others.
However: if you can get that level of CPU use, you’re almost certainly not being limited by your video card or system memory. Whack-a-mole is right in his answers to your questions, but I strongly doubt that a faster video card will make any difference for this app. If it were waiting on either memory or the video card, the CPU would go idle while it waited, and your CPU time percentage would go down (probably way down). Plus, you’ve got a gigabyte of memory – unless your documents are HUGE, that ought to be enough.
Any video card made in the last 5 years should be more than adequate for 2D work, especially on an LCD monitor. The only cases where the video card should matter are 3D (where it matters a lot), huge monitor surfaces (because of the sheer number of pixels involved), or VRAM/main-memory contention, which is usually limited to cheap PCs and many laptops.
I think you’re just plain compute bound – if you want this app to run faster, you’re going to need a faster CPU (which may or may not be an easy upgrade). Alternatively, when you’re not editing, there are non-Adobe PDF viewers out there that are faster at display.
For the folks responding, please note that he’s using Acrobat Pro, not Acrobat reader. I don’t know if y’all are getting them confused, but many people do.
One other thing to try is to delete everything in your Windows temporary directory: use Start -> Run to go to %TEMP%, delete everything (if that directory exists), then repeat with %TMP%. That fixes an enormous number of issues with some Adobe products.
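If you’d rather not delete by hand, here is a rough sketch of the same cleanup in Python. This is my own throwaway script, not anything from Adobe or Microsoft; files currently in use will be locked by Windows, so it just skips anything it can’t remove:

```python
# Rough equivalent of "open %TEMP%, select all, delete" by hand.
# Files currently in use are locked by Windows, so skip anything
# that refuses to go rather than fail.
import os
import shutil

for var in ("TEMP", "TMP"):
    temp_dir = os.environ.get(var)
    if not temp_dir or not os.path.isdir(temp_dir):
        continue
    for name in os.listdir(temp_dir):
        path = os.path.join(temp_dir, name)
        try:
            if os.path.isdir(path):
                shutil.rmtree(path)
            else:
                os.remove(path)
        except OSError:
            pass  # locked or in use: leave it alone
```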
Also, be aware that Acrobat is probably paging on those long documents, so be sure that you’ve got plenty of available, unfragmented disk space.
I agree with TimeWinder that defragmenting your hard drive is a good bet. Increasing your page file might help also; with 1 GB of RAM you should have at least a 1 GB Windows pagefile. OTOH, I would definitely not delete any files in TMP unless you know what you’re doing (or you’re still on Win98). You should, however, empty your Internet Explorer cache (from within IE) before you defragment.
It’s difficult to say whether a new CPU will improve your situation, since you didn’t mention what you currently have. But as mentioned by others, it’s normal for the foreground application to use 80-90% of CPU resources, and 1 GB of RAM is also good enough for this kind of work.
A new graphics card may or may not help; it depends on whether the graphics are vector-based (heavy load) or bitmap (easy load). There are a few options in the program for viewing graphics, though, so maybe you should first try them out. There’s also an option called “Use page cache” somewhere (it’s in Reader 6.0 at least) which supposedly will speed up page display.
Why? The files in TMP are, by definition, temporary. There are numerous applications out there that delete things willy-nilly from this folder at every boot. The purpose of that folder is to hold files that applications don’t need kept around. If an application is currently using a file there, it will be locked and you won’t be able to delete it anyway. It’s possible that an application might have a temporary file that’s not actually open but that it intends to use again later in the run; however: a) in decades of doing this, it’s never happened to me, and b) applications are supposed to be able to handle the deletion of such files “behind their back.”
Unfortunately, cleaning up the temporary items folder fixes so many speed issues with systems that it’s too valuable a technique to pass up: I actually believe Windows would be better off trashing everything in that folder on every boot, like older versions of Macintosh used to.
It just so happens that programs leave files in TEMP which are needed later. If you enable the Modified, Created and Accessed columns in Explorer, you’ll see files which are accessed, even modified, after creation. Deleting TMP files shouldn’t pose a problem (they should be recreated on demand), but I’ve seen computers go down because of this (on W2K & XP). The lesson is: you should never trust a computer programmer. I have examples.
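You don’t have to take my word for it. Here is a quick Python script (my own, nothing official, and note that NTFS does not always update access times promptly) that lists temp files whose last-accessed time is later than their creation time, i.e. files some program came back for:

```python
# List temp files that were read again after being created --
# evidence that some program came back for them later.
import os
import time

temp_dir = os.environ.get("TEMP")
if temp_dir and os.path.isdir(temp_dir):
    for name in os.listdir(temp_dir):
        path = os.path.join(temp_dir, name)
        try:
            st = os.stat(path)
        except OSError:
            continue  # vanished or locked; skip it
        # On Windows, st_ctime is the creation time.
        if st.st_atime > st.st_ctime:
            print(f"{name}: created {time.ctime(st.st_ctime)}, "
                  f"last accessed {time.ctime(st.st_atime)}")
```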
The problem is, IMO, limited, and confined to minor fragmentation issues (as opposed to the flawed pre-W2K model, which could leave thousands of empty or unused files). Personally, to avoid fragmentation from temp files affecting the system, I separate the user TEMP variable and the system TEMP variable: the former goes on a separate volume along with program cache files (e.g. Internet Explorer), the latter on C: with the system page file (fixed size), and I have a third volume for programs and a fourth for data storage.