I answered you in two other threads: THERE IS NOTHING TO CALCULATE FOR MIN AND MAX!
The min must be set as low as possible according to what your STANDALONE player can read. Nothing can calculate that; only tests and experiments can. This minimum is fixed and doesn't have to be changed for each source.
For instance my player reads 64 Kbps very well, so I encode everything with this minimum.
The maximum can't be calculated either, but it's just a matter of habit: it depends only on the length of the movie you have.
The highest value is 2500, and you can use it for a 1h30 movie on a CD80.
For a 1h45 movie, use 2300.
For a 2h00 movie, use 2000.
For something above, use 1800.
You just have to understand that the longer the movie, the narrower the bitrate range (MAX-MIN) must be to obtain a decent average quality. But for sure, the lower the MAX, the worse the quality of action scenes.
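If you want a rough feel for where those MAX values come from, here is a little back-of-the-envelope sketch (just arithmetic, not an exact rule): what actually has to fit on the CD is the average bitrate, so the longer the movie, the less headroom you have to let the MAX climb. The 700 MB CD-80 capacity, 128 Kbps audio and ~2% overhead are my own assumptions, adjust them to your setup.

# A minimal sketch (my assumptions, not an exact formula): the average video
# bitrate that fits is roughly (disc size - audio size) / movie length.
def avg_video_kbps_that_fits(minutes, disc_mb=700, audio_kbps=128, overhead=0.02):
    """Approximate average video bitrate (Kbps) that still fits on the disc."""
    seconds = minutes * 60
    usable_kbits = disc_mb * 1024 * 8 * (1 - overhead)  # disc capacity in kilobits
    audio_kbits = audio_kbps * seconds                   # space eaten by the audio track
    return (usable_kbits - audio_kbits) / seconds        # what is left for the video

for mins in (90, 105, 120, 160):
    print(f"{mins} min -> ~{avg_video_kbps_that_fits(mins):.0f} Kbps average video")

With those assumptions it prints about 913, 764, 653 and 457 Kbps. That is why the longer the movie, the lower you have to keep the MAX, so the encoder can still hit an average that fits on the disc.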
When I do a TV show with no action scenes (Friends) I even use 1300 as the maximum, and I fit 2h40 on a CD with no problem!
As I told you: it's all a matter of habit and experiment. You say you are getting nowhere? That's not true: you are gaining experience!
Note: as for your red boxes, as I explained to you, there is NOTHING that can give you that just because you changed the bitrate, so something is definitely screwed up on your PC!
Note 2: sound desync also has NOTHING to do with the video bitrate, so PLEASE stop mixing up the two problems.

Do one thing at a time, and DON'T encode the video and the audio together.