I've got a powerful system: a P4 1.8 with 512MB of RDRAM. The RDRAM contributes quite a bit here, since both the RAM and the FSB run at 800MHz. So my results don't always compare directly to others', especially where VBR and encoded video are concerned. It may not be the newest P4, but most of them are comparable, even the HT ones.

Your P4 2.4 should come close to what I experience, though.
Now, I ran tests for the various GOP structures.
FYI, ATI MMC closes the GOPs for you by default, so the last 'B' range is missing, which is fine. I always close GOPs anyhow, in any encoder.
I set the P-frame distance based on what I see. I saw more noise on cartoons, especially once I began using 3.42 VBR at 352x480. Lowering the distance introduced fewer compression artifacts.
It's all about temporal compression. With higher bitrates, the GOP radius CAN be increased. Should it be? Well, test.
For the record, such noise is much less apparent on live action. Toons have those large areas of similar information; you can just tell when it goes bad.
Inversely, however, at higher bitrates, like 8.0 and 7.0 for my 720x480 captures, I found the files getting a bit larger than needed, as well as noise being created that would disappear when I expanded back to the default delta pattern (1-4-2).
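To make the structure concrete, here is a minimal sketch of how a GOP pattern like the 1-4-2 default (one I picture, four P pictures, two B pictures between anchors, i.e. the common N=15, M=3 layout) expands into a frame sequence, and how closing the GOP trims the trailing B range. This is a simplified display-order model I'm assuming for illustration, not ATI MMC's actual internals:

```python
def gop_pattern(n=15, m=3, closed=True):
    """Return the display-order frame types of one GOP.

    n: GOP length (frames per I picture)
    m: anchor distance (frames between I/P pictures)
    closed: drop trailing B pictures, which would otherwise
            need the next GOP's I frame as a reference.
    (Simplified illustrative model, not any encoder's exact behavior.)
    """
    frames = []
    for i in range(n):
        if i == 0:
            frames.append("I")       # one intra picture per GOP
        elif i % m == 0:
            frames.append("P")       # forward-predicted anchor
        else:
            frames.append("B")       # bidirectionally predicted
    if closed:
        while frames and frames[-1] == "B":
            frames.pop()
    return "".join(frames)

print(gop_pattern(closed=False))  # IBBPBBPBBPBBPBB
print(gop_pattern(closed=True))   # IBBPBBPBBPBBP  (last B range gone)
```

Shortening the anchor distance (a smaller `m`) means more P frames and fewer B frames per GOP, less temporal prediction, and so fewer of the artifacts I was seeing on cartoons at low bitrates.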
Why does this happen? Well, I have info, but nothing I can put into coherent sentences at this time. It's part of the lovely tech of MPEG. Others have tried to explain it, but I felt their explanations fell short of making coherent sense. For now, I find myself giving the answer my parents gave me as a little kid: "because I said so". Carlinism need not apply, as it's not a bogus reason, just one I'm incapable of stating at this time.
This response is already pretty rambly.