digitalFAQ.com Forums [Archives] (http://www.digitalfaq.com/archives/)
-   Video Encoding and Conversion (http://www.digitalfaq.com/archives/encode/)
-   -   To crop or not to crop! (http://www.digitalfaq.com/archives/encode/1841-crop-crop.html)

kwag 12-19-2002 03:18 PM

Quote:

Originally Posted by heyitsme
kwag,

I used your new file prediction method and new technique to encode Moulin Rouge. I encoded it with the KVCDx3 template with FluxSmooth only, and the MPEG size came out to be 782 megs, which is pretty close. Just wanted to let you know. The quality blew me off my feet. Nice......

Branden

The 782MB, is that video+audio or video only :?:

-kwag

heyitsme 12-19-2002 03:19 PM

Video and audio, with the audio at 112 kbps.

kwag 12-19-2002 03:21 PM

Quote:

Originally Posted by heyitsme
Video and audio, with the audio at 112 kbps.

GREAT! :D One more question: did you use 0.89 as the prediction factor (an error margin of 11 if using KVCD Predictor, since 1 - 0.11 = 0.89), or did you use another value :?:

-kwag

heyitsme 12-19-2002 03:22 PM

I used the 0.89 prediction factor.

kwag 12-19-2002 03:23 PM

Quote:

Originally Posted by heyitsme
I used the 0.89 prediction factor.

BINGO! I think we're starting to hit the jackpot :lol:

-kwag

Boulder 12-19-2002 03:52 PM

I'll encode "Patton" tomorrow (in Finnish time :wink: ) and see if the Predictor will handle this one, the movie's almost 3hrs long. I'll be using KVCDx2 PLUS at 352x576 and post the results along with the script as soon as it's done.

Edit: and to make it difficult for the Predictor, I'm putting permanent subtitles in the movie. It's going to be interesting to see what happens.

kwag 12-19-2002 03:59 PM

Quote:

Originally Posted by Boulder
I'll encode "Patton" tomorrow (in Finnish time :wink: ) and see if the Predictor will handle this one, the movie's almost 3hrs long. I'll be using KVCDx2 PLUS at 352x576 and post the results along with the script as soon as it's done.

Edit: and to make it difficult for the Predictor, I'm putting permanent subtitles in the movie. It's going to be interesting to see what happens.

8O That's a long time for 352x576 8O. Maybe on the LBR, but again, 352x576 AND subtitles :o
You really want to torture the encoder :lol: . Let us know your results :wink:

-kwag

Boulder 12-19-2002 04:02 PM

Oh, forgot to mention that I plan to put the movie on 2 CDs. I can't use the LBR since my Pioneer doesn't like really low bitrates with VCD headers. If I use 352x288 and burn as SVCD (to allow using bitrates lower than 700 kbps), I get some annoying bobbing in the picture, especially in the subs. If the resolution is n x 576, there's no problem.

kwag 12-19-2002 04:07 PM

Quote:

Originally Posted by Boulder
Oh, forgot to mention that I plan to put the movie on 2 CDs.

Phew!, now we're talking :)
Then why not just go 704x576 :?: Or x3 :?:
If I can put "Red Planet" on a single CD-R at 704x480, and you'll be coding only ~90 minutes per CD-R, then your 3 hour movie should look almost like the original on two CDs at 704x576 or 528x576 :idea:

-kwag

Boulder 12-19-2002 04:11 PM

Quote:

Originally Posted by kwag

Then why not just go 704x576 :?: Or x3 :?:
If I can put "Red Planet" on a single CD-R at 704x480, and you'll be coding only ~90 minutes per CD-R, then your 3 hour movie should look almost like the original on two CDs at 704x576 or 528x576 :idea:

-kwag

Hmm, I'll have to think about that. Sounds interesting; it would be the first time I've done any encodes at such large resolutions :twisted:

If I really want to work on this, I'll test the Predictor with both options 8) I might just finish both tomorrow if I start early.

kwag 12-19-2002 04:14 PM

Quote:

Originally Posted by Boulder
Hmm, I'll have to think about that. Sounds interesting; it would be the first time I've done any encodes at such large resolutions :twisted:

Make sure your player can handle 704-wide resolutions (but maybe you already know that :wink: )

-kwag

Jellygoose 12-19-2002 04:28 PM

kwag, why did you change the mode from CQ to CQ_VBR anyway? What's the difference between the two?
Seriously, they both look the same to me when using your Q-Matrix and limiting to a certain file size...

kwag 12-19-2002 04:34 PM

Quote:

Originally Posted by Jellygoose
kwag, why did you change the mode from CQ to CQ_VBR anyway? What's the difference between the two?
Seriously, they both look the same to me when using your Q-Matrix and limiting to a certain file size...

CQ_VBR is better than CQ, at least in all the tests I made with the Q. matrix. CQ encoding was left behind on the original KVCDs WAY back. CQ also caused some compatibility problems. I don't know why, but many users reported unstable (not smooth) video with CQ, but not with CQ_VBR. The bitrate/quality curve created is different too; I recall seeing completely different scales in Bitrate Viewer.

-kwag

SansGrip 12-19-2002 05:15 PM

Quote:

Originally Posted by black prince
Picture quality is fantastic with Blockbuster noise, but file prediction is constantly changing.

Yes, I can see that happening: using Blockbuster, if you encode the sample strip twice with the same settings you might get different results. But how different? I've not tried it myself.

The reason this occurs is that Blockbuster is adding random noise to the clip. Since it's random, it will likely get encoded differently each time you run it.

I just added a new parameter for the noise method allowing one to specify a value with which to seed the pseudo-random number generator. This should cause the same "random" noise to be generated each time you run the filter with the same settings. I'm testing it right now and will release later today if it's all working.
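
For example, with a script like this (the exact parameter names may differ in the release, and the values are just for illustration), every run should now produce identical noise and therefore an identical file size:

Code:

    MPEG2Source("movie.d2v")
    # A fixed seed makes the pseudo-random noise repeatable,
    # so sample encodes stay consistent for file prediction
    Blockbuster(method="noise", detail_min=1, detail_max=10, variance=1, seed=5823)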

SansGrip 12-19-2002 05:17 PM

kwag -

Just so I'm clear, if one increases the error margin in KVCDP from 5% to 11%, the target sample size should decrease, correct?

kwag 12-19-2002 05:44 PM

Quote:

Originally Posted by SansGrip
kwag -

Just so I'm clear, if one increases the error margin in KVCDP from 5% to 11%, the target sample size should decrease, correct?

Now I'm confused here :? With the manual formula, multiplying the final value by .89 causes an increase, because the final file size came out smaller when multiplied by .95. So multiplying by .89 gives a larger file size, closer to the real predicted value.

Anyway, I'm getting mixed results here now, after finishing my 352x240. I had a predicted file size of ~720MB, and wound up with 800MB 8O. So this GOP is screwing every calculation. Time to revise the formula.

I'm now taking 50 snapshots of 48 frames, instead of 100 of 24, with a factor of 1.0 to remove any factor for the time being. I'm going to find an optimal "width" for the sample window that we can use no matter what GOP is used, because it's going to be a circus using different factors for different resolutions. So I'm playing with the amount of frames to take and the window size. Hopefully I'll find a happy medium that applies to all and is consistent no matter what resolution is used. After this is done, we can apply a 5% or so safety factor to the formula. What do you think :?:
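
To put numbers on the sampling change (figures made up just to show the arithmetic): both 50 snapshots of 48 frames and 100 of 24 give a 2,400-frame sample; only the window width changes. If a 150,000-frame movie's 2,400-frame sample encodes to 14MB, the linear prediction is

    14MB x (150,000 / 2,400) = 875MB

and a 5% safety factor on top would mean aiming for 875MB x 0.95 = ~831MB instead.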

Edit: Or 200 samples of 12 frames each. This will increase the sampling resolution. Must try that too.

-kwag

SansGrip 12-19-2002 07:37 PM

Quote:

Originally Posted by kwag
Now I'm confused here :? With the manual formula, multiplying the final value by .89 causes an increase, because the final file size came out smaller when multiplied by .95. So multiplying by .89 gives a larger file size, closer to the real predicted value.

But following all the confusion a while back, we decided that increasing the error margin should decrease the target size, since that's what makes logical sense (a bigger margin for error = more room to play with)...?

Quote:

Time to revise the formula.
Funnily enough, I just got by far the most accurate results yet by taking 100 snapshots of 48 frames, and fooling KVCDP into working with it by setting the "Sample Points" to 200 (200 of the usual 24-frame snapshots adds up to the same 4,800 total frames as my 100 of 48). We're doing that weird "wavelength" thing again ;).

Predicted size was 812MB with an error margin of 0%; the final size was 808MB 8O.

It does take longer to encode the sample strips, but it's surely worth it if it makes the prediction more accurate...

syk2c11 12-19-2002 08:15 PM

Hi Kwag and those experienced users,
Correct me if I'm wrong. The NEW method is as follows:
(1)---Comment out the bilinear resize and add borders lines from the .avs script, and calculate the aspect to set manually in TMPGEnc. Also check that your masks are correct, as in my second sample, because these will change once you remove the bilinear resize and add borders lines from your .avs script. This is what I did: my movie is 720x480, and my target is 704x480. So I use our friend FitCD just to find out the values I need for my resize. When I open my .d2v and select XVCD (704x480) as the destination with 2 blocks overscan, I see that the resize line is BilinearResize(672,352,16,0,688,480), so that means I will use 672x352 in TMPGEnc under "Video arrange method: (Center custom size)". (See the example script after this list.)

(2)---Use new template with new GOP (1-12-2-1-24)

(3)---Use predictor factor as 0.89

Then we will be able to fit 90 mins onto ONE CD-R????
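
A minimal sketch of what the step (1) script might look like (the decoder plugin, file names and border values are illustrative guesses, not anyone's exact script):

Code:

    LoadPlugin("mpeg2dec3.dll")              # illustrative decoder plugin
    MPEG2Source("movie.d2v")                 # 720x480 DVD source
    FluxSmooth()                             # noise reduction, as used earlier in the thread
    # BilinearResize(672,352,16,0,688,480)   # commented out: TMPGEnc resizes to 672x352 instead
    # AddBorders(16,64,16,64)                # commented out: TMPGEnc's "Center (custom size)" pads to 704x480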

SansGrip 12-19-2002 08:20 PM

Quote:

Originally Posted by syk2c11
(1)---Comment out the bilinear resize and add borders lines from the .avs script, and calculate the aspect to set manually in TMPGEnc.

I know kwag is the man when it comes to TMPGEnc, but I'm still not convinced that the big savings he saw were the result of using TMPGEnc's internal resize.

Personally I see a size increase when I switch from Avisynth's bilinear to TMPGEnc's internal resize.

I have a feeling the big difference kwag saw was the result of adding borders after Blockbuster. But I could be wrong :).
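
For example (illustrative ordering only, not kwag's actual script), putting AddBorders after Blockbuster means the noise lands on the picture while the borders stay pure black, so they cost almost nothing to encode:

Code:

    MPEG2Source("movie.d2v")
    BilinearResize(672, 352, 16, 0, 688, 480)
    Blockbuster(method="noise")   # noise is added to the picture only...
    AddBorders(16, 64, 16, 64)    # ...so the borders stay flat black and cheap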

Quote:

(2)---Use new template with new GOP (1-12-2-1-24)
Yes. This new GOP seems to be very good. kwag's definitely right about that one ;).

Quote:

(3)---Use predictor factor as 0.89
As far as I know this is still up in the air. We'd appreciate it if you could test a few values for us, though -- say, ranging from 0.85 to 1.0... ;)

Quote:

then we will be able to fit 90 mins into ONE CD-R????
Yesterday I put American Pie (1h35m) on one CD at 704x480. Looks awesome.

SansGrip 12-19-2002 08:34 PM

OK, here are the results of my tests regarding bp's recent post about Blockbuster and unpredictable file sizes. I encoded a 2059-frame sample three times using exactly the same settings and script. Here's what I got:

1st encoding: 10,555,039 bytes
2nd encoding: 10,554,551 bytes
3rd encoding: 10,554,716 bytes

So it seems Blockbuster does indeed cause some unpredictability, though the spread here is only ~500 bytes (about 0.005%). The amount of variation will depend on how much noise it has to add to the source material.

Now here's the same sample encoded with the same settings, the only difference being the use of the new seed parameter:

1st encoding: 10,554,519 bytes
2nd encoding: 10,554,519 bytes
3rd encoding: 10,554,519 bytes

I can think of no reason why this would affect anything in an adverse way, so I'm going to package it up and make a release in a few minutes.

