Quote:
-kwag |
Video and audio at 112 kbps.
|
Quote:
-kwag |
I used the 0.89 prediction factor.
|
Quote:
-kwag |
I'll encode "Patton" tomorrow (in Finnish time :wink: ) and see if the Predictor will handle this one, the movie's almost 3hrs long. I'll be using KVCDx2 PLUS at 352x576 and post the results along with the script as soon as it's done.
Edit: and to make it difficult for the Predictor, I'm putting permanent subtitles in the movie. It's going to be interesting to see what happens. |
Quote:
You really want to torture the encoder :lol: . Let us know your results :wink: -kwag |
Oh, forgot to mention that I plan to put the movie on 2 CDs. I can't use the LBR since my Pioneer doesn't like really low bitrates with VCD headers. If I use 352x288 and burn as SVCD (to allow using bitrates lower than 700), I get some annoying bobbing in the picture, especially in the subs. If the resolution is n x 576, there's no problem.
|
Quote:
Then why not just go 704x576 :?: Or x3 :?: If I put "Red Planet" on a single CD-R at 704x480, then you'll be coding only ~90 minutes per CD-R. So your 3-hour movie should look almost like the original on two CDs at 704x576 or 528x576 :idea: -kwag |
Quote:
If I really want to work on this, I'll test the Predictor with both options 8) I might just finish both tomorrow if I start early. |
Quote:
-kwag |
kwag, why did you change the mode from CQ to CQ_VBR anyway? What's the difference between these two?
Seriously, they both look the same to me when using your Q-Matrix and limiting them to a certain file size... |
Quote:
-kwag |
Quote:
The reason this occurs is because Blockbuster is adding random noise to the clip. Since it's random, it'll likely get encoded differently each time you run it. I just added a new parameter for the noise method allowing one to specify a value with which to seed the pseudo-random number generator. This should cause the same "random" noise to be generated each time you run the filter with the same settings. I'm testing it right now and will release later today if it's all working. |
kwag -
Just so I'm clear, if one increases the error margin in KVCDP from 5 to 11%, the target sample size should decrease, correct? |
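The thread doesn't show KVCDP's actual formula, but the intuition can be sketched: the error margin backs the sample's byte target off from a straight proportional share of the final size. Everything below (function name, numbers) is hypothetical, purely to illustrate the direction of the effect:

```python
# Hypothetical sketch of how an error margin could shrink the sample target.
# KVCDP's real formula is not shown in this thread; all names/numbers are made up.

def sample_target_bytes(final_target, movie_frames, sample_frames, error_margin):
    """Scale the final target down to the sample's share of the movie,
    then back off by the error margin as a safety cushion."""
    share = sample_frames / movie_frames
    return final_target * share * (1.0 - error_margin)

# 800 MB target, ~150,000-frame movie, 2400 sampled frames
at_5 = sample_target_bytes(800_000_000, 150_000, 2400, 0.05)
at_11 = sample_target_bytes(800_000_000, 150_000, 2400, 0.11)
assert at_11 < at_5  # a bigger margin means a smaller sample target
```

So yes: under this reading, raising the margin from 5% to 11% should make the target sample size smaller.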
Quote:
Edit: Or 200 samples of 12 frames each. This will increase the sampling resolution. Must try that too. -kwag |
Quote:
Quote:
Predicted size was 812mb with an error margin of 0%, final size is 808mb 8O. It does take longer to encode the sample strips, but it's surely worth it if it makes the prediction more accurate... |
Hi Kwag and those experienced users,
Correct me if I'm wrong. The NEW method is as follows:
(1) Comment out the bilinear resize and AddBorders lines from the .avs script, and calculate the aspect manually in TMPGEnc. Also check that your masks are correct, as in my second sample, because these will change once you remove the bilinear resize and AddBorders lines from your .avs script. This is what I did: my movie is 720x480, and my target is 704x480. So I use our friend FitCD just to find out the values I need for my resize. When I open my .d2v and select XVCD (704x480) as destination with 2 blocks overscan, I see that the resize line is this: BilinearResize(672,352,16,0,688,480). That means I will use 672x352 in TMPGEnc under "Video arrange method: Center (custom size)".
(2) Use the new template with the new GOP (1-12-2-1-24).
(3) Use a prediction factor of 0.89.
Then will we be able to fit 90 mins onto ONE CD-R???? |
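The FitCD line above encodes both the target frame and the active picture size, so the centering values for TMPGEnc fall out of simple arithmetic. A quick Python check of the border math (values taken from the post; the helper name is just for illustration):

```python
# Check the border math behind BilinearResize(672,352,...) for a 704x480 target.
# Values come from the post; the function name is made up for illustration.

def borders(target_w, target_h, resized_w, resized_h):
    """Left/right and top/bottom border sizes needed to center the
    resized picture inside the target frame (what TMPGEnc's
    "Center (custom size)" arrange method does)."""
    return (target_w - resized_w) // 2, (target_h - resized_h) // 2

lr, tb = borders(704, 480, 672, 352)
print(lr, tb)  # 16 pixels left/right, 64 pixels top/bottom
```

That is, a 672x352 picture centered in 704x480 leaves 16-pixel side borders and 64-pixel top/bottom borders, which is what the masks must cover.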
Quote:
Personally I see a size increase when I switch from Avisynth's bilinear to TMPGEnc's internal resize. I have a feeling the big difference kwag saw was the result of adding borders after Blockbuster. But I could be wrong :). |
Ok, here are the results of my tests regarding bp's recent post about Blockbuster and unpredictable file sizes. I encoded a 2059-frame sample three times using exactly the same settings and script. Here's what I got:
1st encoding: 10,555,039 bytes
2nd encoding: 10,554,551 bytes
3rd encoding: 10,554,716 bytes

So it seems Blockbuster does indeed cause some unpredictability. The amount of variation seen will depend on how much noise it's having to add to the source material. Now here's the same sample encoded with the same settings, the only difference being the use of the new seed parameter:

1st encoding: 10,554,519 bytes
2nd encoding: 10,554,519 bytes
3rd encoding: 10,554,519 bytes

I can think of no reason why this would affect anything in an adverse way, so I'm going to package it up and make a release in a few minutes. |
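Seeding the pseudo-random generator to make "random" output repeatable is exactly the standard fix here. Blockbuster's own PRNG lives inside the filter, so the snippet below only demonstrates the principle in Python, not the filter's actual code:

```python
import random

def noise_samples(seed, n=5):
    """Generate n pseudo-random noise values from a fixed seed."""
    rng = random.Random(seed)
    return [rng.randint(-2, 2) for _ in range(n)]

# Same seed -> identical "random" noise on every run, so the encoder
# sees byte-identical input and produces byte-identical output sizes.
assert noise_samples(1) == noise_samples(1)
# Different seeds -> (almost certainly) different noise patterns.
print(noise_samples(1), noise_samples(2))
```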
I'm getting ready to encode Pearl Harbor to one CD with the LBR template and new GOP. I encoded a test sample with no filters and the Gibbs effect was very bad, so now I'm trying out SansGrip's filters for the first time! :D
I used this script: Code:
LoadPlugin("C:\encoding\MPEG2DEC2.dll")

btw, excellent documentation with your filters :) thanks. |
Quote:
Quote:
I generally use: Code:
BlahSource("foo.bar")

As for sharpening, I generally don't do it. In my experience it increases the file size without any real benefit. If I think the movie could do with a little sharpening I'll usually switch to bicubic or Lanczos resizing, which result in a sharper image than bilinear. Quote:
|
Quote:
Quote:
Quote:
-kwag |
@SansGrip and Kwag,
It would be nice to resolve the question of Avisynth vs. TMPGEnc resize affecting file size. Encoding with TMPGEnc's resize is Sloooooow!! :cry: I even wonder if TMPGEnc's border masking couldn't be reproduced from the .avs script. If Crop is cutting borders top and left, then the leakage has to be bottom and right, which TMPGEnc's mask is ignoring for file size savings. :?: Is there a filter to mask borders the same as TMPGEnc :?: -black prince |
Quote:
|
Update flash :wink:
Just finished my 352x240 LBR MIN=300Kbps, MAX=1,800Kbps test with "Bug's Life" for the second time today. Instead of using 100 samples at 24 frames each, I used 50 samples of 48 frames each. The file size prediction came out as follows: Wanted file size: 725,112KB. Final file size: 727,262KB :mrgreen: That was with the prediction factor at 1.0, so for safety I'll now set it to 0.98, and that should be good enough. The previous encode with 100 samples of 24 frames each came out to ~800MB 8O Now I'm encoding "Red Planet" at 704x480 with the same formula adjustments and the Blockbuster noise filter. Four hours to go..... :wink: -kwag |
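The extrapolation behind this prediction is straightforward; here is a rough Python sketch. The byte count is made up and KVCDP's real formula may include more terms, so treat this as an illustration of the scaling only:

```python
def predict_size(sample_bytes, sampled_frames, total_frames, factor=1.0):
    """Extrapolate the full-movie size from the encoded sample strips.

    factor is the 'prediction factor': values below 1.0 (e.g. 0.98)
    deliberately under-shoot the target as a safety margin."""
    return sample_bytes * (total_frames / sampled_frames) * factor

# Hypothetical numbers: 50 strips of 48 frames = 2400 sampled frames
# out of a 150,000-frame movie, with the strips encoding to ~11.6 MB.
full = predict_size(11_600_000, 50 * 48, 150_000)
safe = predict_size(11_600_000, 50 * 48, 150_000, factor=0.98)
assert safe < full  # 0.98 backs the prediction off by 2%
```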
@SansGrip,
I believe this would mean a value of 2 instead of 5 in KVCD Predictor :?: -kwag |
Quote:
Code:
IL = Framecount / 50

@kwag I'll try 50/48 in a moment, but I'd be interested to hear whether you get even more accurate results with 100 samples of 48 frames (Sample Points set to 200, if you're using KVCDP), or whether 50 is indeed enough. Care to run another encode? :mrgreen: |
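That `IL = Framecount / 50` line is just the stride between evenly spaced sample strips. In Python, picking 50 strips of 48 frames could look like this (function name invented for illustration; in Avisynth, SelectRangeEvery does the same job):

```python
def sample_ranges(framecount, strips=50, strip_len=48):
    """Start/end frame numbers for evenly spaced sample strips."""
    il = framecount // strips  # interval length, as in IL = Framecount / 50
    return [(i * il, i * il + strip_len) for i in range(strips)]

ranges = sample_ranges(150_000)
print(len(ranges), ranges[0], ranges[1])  # 50 strips: (0, 48), (3000, 3048)
```

Doubling Sample Points to 100 halves the stride, so the strips land closer together and catch more of the movie's bitrate variation.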
Quote:
While the number of sample points is of course important, all it's really doing is giving us more representative material to work with than simply taking 2400 frames from the beginning of the movie. I think with almost all movies it should be adequate to sample 50 times. |
Hey Kwag,
I hope you're using the latest Blockbuster 0.6 with the seed parm: Code:
LoadPlugin("E:\DVD Backup\2 - DVD2SVCD\MPEG2DEC\MPEG2DEC.dll")
LoadPlugin("E:\DVD Backup\2 - DVD2SVCD\BlockBuster\BlockBuster.dll")
LoadPlugin("E:\DVD Backup\2 - DVD2SVCD\LegalClip\LegalClip.dll")
mpeg2source("D:\Temp\movie.d2v")
LegalClip()
LanczosResize(496,448)
Blockbuster( method="noise", detail_min=1, detail_max=10, variance=1, seed=1 )
AddBorders(16,16,16,16)
LegalClip()
Seed makes the filter generate the same noise pattern every run, so the file size won't vary when using noise. 8) Just encoded Minority Report using 528x480, CQ_VBR=11, audio 128kb on 2 CDs (800MB each): 1.4 GB. The quality is the best I've seen. Even the Gibbs effect has almost vanished. Fast action scenes were near perfect. This is by far the best picture quality yet. The DVD and the backup look the same. Since I could have used a higher CQ_VBR to fill up the CDs, I'm going to wait until the file prediction formula is finalized :D -black prince |
Quote:
Right now I'm seeing if I can get Untouchables (1h59m) on one disc at 528x480. Maybe I'm dreaming, or maybe not..... :D |
Quote:
-kwag |
Quote:
-kwag |
Quote:
9.25 (by the way, I added a decimal place in the encoding helper ;)) is a touch on the low side, but the samples looked pretty good to me. If the Gibbs is too noticeable on the TV I'll re-encode at 352x480 with the new GOP and formula. I'm beginning to think that with the various filters you've crafted there are very few movies we can't get on one disc... 8O Quote:
|
Quote:
Slightly less typing ;). Edit: Heck, you could even use Blockbuster(method="noise", variance=.5, seed=5823) :mrgreen: |
Quote:
-kwag |
Quote:
I've been doing that (100 snapshots of 2 seconds) for some time now. My DVD player doesn't like MPEG-1 VBR at resolutions above 352x240, so if I want anything else I have to use MPEG-2. And, I don't know why, I couldn't get accurate prediction with the default script (and MPEG-2), so I tried exactly what you said and it gave me good accuracy. I never tried 50 snapshots... but maybe it would work well too. |
Quote:
Let's see... "SansGrip" is 53616E7347726970 in hex, which in decimal is 6,008,204,819,287,927,152. Nope, wouldn't fit ;). |
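For the curious, the arithmetic in the joke checks out. A quick Python verification, treating the name's ASCII bytes as one big-endian integer:

```python
# Verify the "SansGrip as a seed" joke: the name's ASCII bytes
# read as a single big-endian integer.
name = b"SansGrip"
print(name.hex().upper())           # 53616E7347726970
print(int.from_bytes(name, "big"))  # 6008204819287927152
```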
Quote:
|
Quote:
-kwag |
Site design, images and content © 2002-2024 The Digital FAQ, www.digitalFAQ.com
Forum Software by vBulletin · Copyright © 2024 Jelsoft Enterprises Ltd.