10-07-2003, 05:03 PM
|
Free Member
|
|
Join Date: May 2003
Posts: 10,463
Thanks: 0
Thanked 0 Times in 0 Posts
|
|
Okay, I think it's time to tell you about the latest changes I made to the script for DivX.
The following script gives purely fantastic results. The only drawback is that it is quite slow (but not that bad: I encode at 0.25x on my 1.3 GHz P4).
You can remove the second Blockbuster if your original source isn't too blocky, to gain a little encoding time (a sketch of that variant follows the notes below).
Code:
AviSource("PATH\NAME.avi",false)
BlindPP(cpu=4)
Blockbuster(method="noise",detail_min=1,detail_max=5,variance=0.1,seed=5823)
VagueDenoiser(threshold=1.5,method=1,nsteps=6,chroma=true)
GripCrop(WIDTH, HEIGHT, overscan=1, source_anamorphic=false)
GripSize(resizer="LanczosResize")
Undot()
TemporalSoften(2,7,7,3,2)
DCTFilter(1,1,1,1,1,1,0.5,0)
Blockbuster(method="noise",detail_min=1,detail_max=10,variance=0.3,seed=5623)
GripBorders()
Note 1: be careful to use the latest version of VagueDenoiser (minimum 0.28). See http://kurosu.inforezo.org/avs/VagueDenoiser/index.html
Note 2: if VagueDenoiser complains that your source dimensions are too small for a 6-step denoising, just reduce the value of the "nsteps" parameter.
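Just to illustrate both adjustments at once, the lighter variant would look something like this (nsteps=5 is only an example value; you only need to lower it if VagueDenoiser complains about your source dimensions):
Code:
AviSource("PATH\NAME.avi",false)
BlindPP(cpu=4)
Blockbuster(method="noise",detail_min=1,detail_max=5,variance=0.1,seed=5823)
VagueDenoiser(threshold=1.5,method=1,nsteps=5,chroma=true)   # nsteps lowered for sources too small for 6 steps
GripCrop(WIDTH, HEIGHT, overscan=1, source_anamorphic=false)
GripSize(resizer="LanczosResize")
Undot()
TemporalSoften(2,7,7,3,2)
DCTFilter(1,1,1,1,1,1,0.5,0)
# second Blockbuster dropped here to save encoding time on sources that are not too blocky
GripBorders()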
|
10-09-2003, 11:56 AM
|
Free Member
|
|
Join Date: Jul 2003
Location: Bretagne
Posts: 17
Thanks: 0
Thanked 0 Times in 0 Posts
|
|
Hi
For the VagueDenoiser part, have you also tried method=2? It may keep more detail, and you would be able to use a slightly higher threshold.
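Something like this, for example, as a drop-in replacement for the VagueDenoiser line in the script above (the 2.0 is only a guess at "a little higher", not a tested value):
Code:
VagueDenoiser(threshold=2.0,method=2,nsteps=6,chroma=true)   # method=2 with a slightly higher threshold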
|
10-09-2003, 03:51 PM
|
Free Member
|
|
Join Date: May 2003
Posts: 10,463
Thanks: 0
Thanked 0 Times in 0 Posts
|
|
Lol.
I used 0, 1 and 3 but didn't try method=2.
I will. But doesn't it affect speed? For the threshold, I found in debug mode that 1.5 is "far enough but not too much".
|
10-09-2003, 04:16 PM
|
Free Member
|
|
Join Date: Sep 2003
Posts: 21
Thanks: 0
Thanked 0 Times in 0 Posts
|
|
First off, you said not to get anything below 0.28 for VagueDenoiser, but the site you gave only lists 0.27.1 as the highest version. And second, what are the average encoding times compared to your other optimal scripts?
|
10-09-2003, 04:29 PM
|
Free Member
|
|
Join Date: May 2003
Posts: 10,463
Thanks: 0
Thanked 0 Times in 0 Posts
|
|
Quote:
Originally Posted by Zolie
First off, you said not to get anything below 0.28 for VagueDenoiser, but the site you gave only lists 0.27.1 as the highest version. And second, what are the average encoding times compared to your other optimal scripts?
|
There is an error in the text of the page: you can see that there are two 0.27.1 links, but the second one actually downloads 0.28.
The encoding time is about 25% more than with a script using Deen (which is not very fast itself).
A movie that encoded in 7 hours on my PC took 9 hours with VagueDenoiser.
|
10-09-2003, 08:29 PM
|
Free Member
|
|
Join Date: Sep 2003
Posts: 21
Thanks: 0
Thanked 0 Times in 0 Posts
|
|
Is the difference really worth the 2 extra hours, or is it just a bit better? Oh, and how much larger (or smaller) are the files?
|
10-09-2003, 11:58 PM
|
Free Member
|
|
Join Date: Jul 2003
Location: Pazardjik, Bulgaria
Posts: 147
Thanks: 0
Thanked 0 Times in 0 Posts
|
|
This script is really slow for me; I will see later what quality it produces.
|
10-10-2003, 05:13 AM
|
Free Member
|
|
Join Date: May 2003
Location: Germany
Posts: 3,189
Thanks: 0
Thanked 0 Times in 0 Posts
|
|
Hi Dialhot,
Well, I'm experimenting a lot with VagueDenoiser on other projects as well.
It seems to work a bit heavily, "flattening" a lot of detail, for example in skin or hair structures.
But in combination with TemporalSoften(2, .......), doesn't it delete a lot?
|
10-10-2003, 06:41 AM
|
Free Member
|
|
Join Date: May 2003
Location: Antwerp, Belgium
Posts: 43
Thanks: 0
Thanked 0 Times in 0 Posts
|
|
I too think that it softens the picture too much. The previous script kept detail better, IMHO.
|
10-10-2003, 07:42 AM
|
Free Member
|
|
Join Date: May 2003
Location: Germany
Posts: 3,189
Thanks: 0
Thanked 0 Times in 0 Posts
|
|
I think we have to keep in mind that every DivX/XviD source is different, so this seems to be a "general" script.
If I have, for example, a very well encoded 1400 MB DivX of a movie, I don't have to clean it that much; on the other hand, on an already badly encoded DivX, more cleaning makes a lot of sense.
|
10-11-2003, 03:43 AM
|
Free Member
|
|
Join Date: May 2003
Location: Antwerp, Belgium
Posts: 43
Thanks: 0
Thanked 0 Times in 0 Posts
|
|
So that would mean changing which values to what?
|
10-11-2003, 06:08 AM
|
Free Member
|
|
Join Date: Jul 2003
Location: Pazardjik, Bulgaria
Posts: 147
Thanks: 0
Thanked 0 Times in 0 Posts
|
|
You can use another script for clean sources.
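Just as an illustration of what a lighter chain for an already clean source could look like (these values are only guesses, not an actual tested script):
Code:
AviSource("PATH\NAME.avi",false)
BlindPP(cpu=4)                                               # light deblocking is often enough on a well-encoded source
VagueDenoiser(threshold=0.8,method=1,nsteps=6,chroma=true)   # much lower threshold: only a touch of denoising
GripCrop(WIDTH, HEIGHT, overscan=1, source_anamorphic=false)
GripSize(resizer="LanczosResize")
GripBorders()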
|
10-11-2003, 02:09 PM
|
Free Member
|
|
Join Date: May 2003
Posts: 10,463
Thanks: 0
Thanked 0 Times in 0 Posts
|
|
Hi all,
Not a lot of time these days, so I can't answer everything and everyone. It seems the main conclusion from all your tests of this new script is that it is too heavy on detail removal.
Well, in fact I focused on block removal (that is the goal of a script for AVI sources, after all) and perhaps didn't compare enough against the previous script on this point (detail level).
I'm quite sure the problem isn't in VagueDenoiser but in TemporalSoften. I added that line long after all my tests, just to get an encoded size equivalent to the previous script. Perhaps the values are too strong; try to check this if you have time. I plan to run tests using ATC instead of TemporalSoften. It will surely be lighter (and faster).
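If someone wants to check it quickly, here is a small sketch (the weaker values are only an example of "less strong", nothing I have verified):
Code:
# in the script above, replace the TemporalSoften line with a weaker one, or disable it entirely
TemporalSoften(2,3,3,3,2)     # lower luma/chroma thresholds (3 instead of 7) remove less detail
# TemporalSoften(2,7,7,3,2)   # original line, kept here for comparison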
|