Linux: Building Encoding Cluster - Page 2 - digitalFAQ.com Forums [Archives]
  #21  
04-15-2004, 10:29 PM
ak47
Free Member
 
Join Date: Oct 2002
Posts: 168
Thanks: 0
Thanked 0 Times in 0 Posts
don't give up
http://howto.ipng.be/openMosixWiki/i...ork%20smoothly
__________________
Later ak
  #22  
04-16-2004, 05:25 AM
venkatk
Free Member
 
Join Date: Jan 2004
Location: Planet Earth
Posts: 26
Thanks: 0
Thanked 0 Times in 0 Posts
Quote:
Originally Posted by ak47
As Mencoder does all filtering and encoding in one single process, it can't be migrated to cluster nodes. I am exploring other encoders that can work in a cluster. Right now I am evaluating Transcode and FFmpeg.

I am not backing off from setting up the cluster. I was waiting for motherboards and RAM for two cluster nodes. I have already received the motherboards. Over here, motherboards like the Intel 865GBFL are not stocked well, as the volumes are not there.
__________________
venkat
  #23  
04-19-2004, 12:04 AM
venkatk
Free Member
 
Join Date: Jan 2004
Location: Planet Earth
Posts: 26
Thanks: 0
Thanked 0 Times in 0 Posts
Hi,

I have one quick question:

Can we encode with "Mencoder" in multiple chunks and later merge them into one single m2v? I think this is the only way you can use "Mencoder" in an "openMosix" cluster environment.
__________________
venkat
  #24  
04-19-2004, 12:27 AM
kwag
Free Member
 
Join Date: Apr 2002
Location: Puerto Rico, USA
Posts: 13,537
Thanks: 0
Thanked 0 Times in 0 Posts
Quote:
Originally Posted by venkatk
Hi,

I have one quick question:

Can we encode with "Mencoder" in multiple chunks and later merge them into one single m2v? I think this is the only way you can use "Mencoder" in an "openMosix" cluster environment.
Yes, but you should keep some things in mind.
If you plan to encode a set of frame ranges on each CPU, you should take scene detection into account: the last frame of each segment should be the final frame just before a scene change, and the next frame, which belongs to a completely different scene, should be the start frame for the next CPU. This way, you'll maximize compression per segment.
Hope you understand what I mean

-kwag
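
To make that concrete, here is a minimal sketch of how two such segments might be encoded in parallel with MEncoder and then joined. The split frame, timestamp, bitrate, and file names are only illustrative assumptions (the real split points would come from a scene-detection pass), and it assumes a MEncoder build whose -of rawvideo output gives a bare MPEG-2 elementary stream that concatenates cleanly; a real setup should verify both.

Code:
#!/bin/sh
# Illustrative only: split one source at an assumed scene change (frame 5000,
# roughly 208.5 s at 23.976 fps) and encode both halves as separate processes.
SRC=movie.vob

# Segment 1: the first 5000 frames, ending on the last frame before the cut
mencoder "$SRC" -nosound -ovc lavc \
  -lavcopts vcodec=mpeg2video:vbitrate=2000 \
  -frames 5000 -of rawvideo -o seg1.m2v &

# Segment 2: start at the scene change and run to the end of the source
mencoder "$SRC" -nosound -ovc lavc \
  -lavcopts vcodec=mpeg2video:vbitrate=2000 \
  -ss 208.5 -of rawvideo -o seg2.m2v &

wait
# Crude join of the two MPEG-2 elementary streams into one m2v
cat seg1.m2v seg2.m2v > full.m2v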
  #25  
04-19-2004, 12:43 AM
venkatk
Free Member
 
Join Date: Jan 2004
Location: Planet Earth
Posts: 26
Thanks: 0
Thanked 0 Times in 0 Posts
Quote:
Originally Posted by kwag
Yes, but you should keep some things in mind.
If you plan to encode a set of frame ranges on each CPU, you should take scene detection into account: the last frame of each segment should be the final frame just before a scene change, and the next frame, which belongs to a completely different scene, should be the start frame for the next CPU. This way, you'll maximize compression per segment.
Hope you understand what I mean

-kwag
Of course, I knew there were some issues involved, but I wanted to hear from the experts.

What if we split the movie into different chunks based on chapters?

Here is my plan: write a small script which takes the VOB(s) and the chapter list as arguments and spawns multiple encodes, depending on the number of nodes available. Systems like Condor can do the scheduling for you.
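
Along those lines, a rough sketch of what such a wrapper could look like, assuming the chapter list is supplied as a plain file of start-time/duration pairs. All file names, the bitrate, and the simple concatenation at the end are illustrative assumptions, not a tested setup.

Code:
#!/bin/sh
# Hypothetical wrapper: one MEncoder process per chapter, so a scheduler
# (Condor, or openMosix process migration) can spread the jobs over the nodes.
# Usage: chapter_encode.sh movie.vob chapters.txt
# chapters.txt is assumed to hold one "start_seconds duration_seconds" pair per line.
SRC="$1"
CHAPTERS="$2"

n=1
while read START LENGTH; do
  OUT=$(printf 'chapter_%02d.m2v' "$n")
  mencoder "$SRC" -nosound \
    -ovc lavc -lavcopts vcodec=mpeg2video:vbitrate=2000 \
    -ss "$START" -endpos "$LENGTH" \
    -of rawvideo -o "$OUT" &
  n=$((n + 1))
done < "$CHAPTERS"

wait
# Crude join of the per-chapter elementary streams back into a single m2v
cat chapter_*.m2v > movie_full.m2v

Whether that plain concatenation is clean enough for authoring would need to be checked; the joins may need fixing up with a proper MPEG tool.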
__________________
venkat