Forum Why is VHS stored Aspect Ratio different from Display Aspect Ratio?

#1
02-03-2023, 07:22 PM
 Duxa Free Member Join Date: Jan 2023 Posts: 18 Thanked 2 Times in 2 Posts
Just curious: why does VHS store data in a different aspect ratio than 4:3 but play back in 4:3? The stored aspect ratio seems to be 3:2 (1.5), which is halfway between the modern 16:9 (~1.78) and 4:3 (~1.33). Why weren't TVs 3:2? Does it have to do with legacy television? Was it originally broadcast at 4:3, so VHS displayed at 4:3 for compatibility? Why store it at 3:2 then?
#2
02-03-2023, 11:53 PM
 latreche34 Free Member Join Date: Dec 2015 Location: USA Posts: 2,649 Thanked 465 Times in 431 Posts
When Sony made its first D1 digital tape recorder, which could record analog video from cameras and telecine machines into digital form, it was based on a standard called Rec.601. Back then there was no such thing as a pixel: the frame was defined by its active scan lines (486 for NTSC), and each scan line was characterized by its frequency of change, i.e. luminance (luma) transitions. When the analog-to-digital conversion was drafted, a sampling rate was assigned based on the highest frequency the formats could deliver (pro analog formats, usually; VHS barely resolves 200 luma changes) plus some math, and it came out to roughly 704 samples per line, give or take. But knowing that not all cameras and telecine machines center their image perfectly, they added 16 samples to the total, 8 on each side, so nothing would be missed: hence 720. The 720 analog samples become 720 pixels in digital, but only 704 of them are the actual frame.

To understand the scan line samples, imagine a scan line made up of alternating white and black dots. The highest number of dots that can fit in a single scan line without blending into a blurry gray line is the line's maximum useful sampling rate: that came to 704, i.e. 352 black dots and 352 white dots. Had they instead started from the 486/480 line count and multiplied by 4/3, they would have had only 648/640 samples, too few to fully resolve an NTSC scan line, so they opted for non-square pixels: 704x486 on pro devices, 704x480 on consumer devices.
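The sampling arithmetic described above can be sketched in a few lines of Python. The 13.5 MHz figure is the actual Rec.601 luma sampling clock; the active-line duration is nominal:

```python
# Sketch of the Rec.601 line-sampling math from the post above.
# 13.5 MHz is the Rec.601 luma sampling clock; 52.7 us is the
# nominal active portion of an NTSC scan line.
SAMPLE_RATE_HZ = 13.5e6
ACTIVE_LINE_S = 52.7e-6

raw_samples = SAMPLE_RATE_HZ * ACTIVE_LINE_S
print(round(raw_samples))   # ~711 samples across the active line

active = 704                # samples covering the nominal 4:3 frame
margin = 8                  # safety samples on each side for off-center imaging
total = active + 2 * margin
print(total)                # 720
```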

Last edited by latreche34; 02-04-2023 at 12:05 AM.
#3
02-04-2023, 12:29 AM
 Duxa Free Member Join Date: Jan 2023 Posts: 18 Thanked 2 Times in 2 Posts
Quote:
 Originally Posted by latreche34 When Sony made their first D1 digital tape recorder that can record analog video into digital format from cameras and telecine machines it was based on a standard called Rec.601 [...] 720 analog samples give 720 pixels in digital but only 704 are the actual frame.
wow that makes a lot of sense. Pretty interesting! Thanks!
#4
02-04-2023, 07:02 AM
 timtape Free Member Join Date: Sep 2020 Location: Perth, Western Australia Posts: 320 Thanked 68 Times in 61 Posts
Quote:
 Originally Posted by Duxa Why werent TVs 3:2? Does it have to do with legacy television? Was it originally broadcast at 4:3? So for compatibility VHS displayed at 4:3? Why store it at 3:2 then?
4:3 had been a standard in professional and amateur movie making for a very long time, so it's no surprise television copied it for compatibility. Interestingly, when the new TV medium came along it started to threaten the income of the movie industry, which tried to attract people back to the cinemas with, among other things, impressive widescreen formats.

In a way, the aspect ratio has more to do with how the camera frames the scene, and with presenting it in that same aspect ratio so recognizable objects like faces look real, even though the processing between shooting and final presentation might depart from it along the way.

It seems today there's an obsession with "updating" anything in 4:3 for presentation in 16:9, even if it means chopping off the top and bottom of the frame, or stretching and squeezing the scene to fit the wider frame. In many cheaper documentaries we see a close-up of a face where the forehead and chin no longer appear, or, if they do, the face has grown very fat...
#5
02-04-2023, 07:27 AM
 dpalomaki Free Member Join Date: Feb 2014 Location: VA Posts: 1,588 Thanked 348 Times in 304 Posts
To add a bit, decisions were based in large part on the available technologies of the time. More resolution cost more money, and there was no point in going beyond what most people could see with their eyes.

For analog TV (and VHS) the original 4x3 aspect was driven by movie film. VHS has no pixels, only scan lines (486 for NTSC) and scan timing (~53 microseconds for the active image portion of a scan line), plus some overhead for retrace and frame sync. (Although widescreen movie formats were developed in the 1920s, they did not become common until the 1950s, well after NTSC TV was widespread. Making a widescreen picture tube would not have been economical; recall that picture tubes were circular and masked to 4x3.)

The nominal 720x480 digital SD was designed to capture the NTSC broadcast image, which had a 4.2 MHz luma bandwidth and a frame rate of 30/sec. The 480 corresponds to the 486 active picture lines rounded down to a multiple of 16, which digital processing prefers. The lost lines fall in the analog TV overscan area not displayed by TV sets, so there was no practical loss and it saved some bandwidth. 720 was chosen because it could accurately represent the active portion of the scan line of an SD video signal, and probably because non-square pixels let the same count work for both NTSC and PAL.
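As a sketch, the non-square-pixel bookkeeping works out exactly: with the conventional SD 4:3 pixel aspect ratios (10:11 for NTSC, 12:11 for PAL), the 704-sample active frame displays as 4:3 in both systems:

```python
from fractions import Fraction

# Display aspect = storage aspect x pixel aspect ratio (PAR).
# 10:11 (NTSC) and 12:11 (PAL) are the conventional SD 4:3 PARs.
def display_aspect(width, height, par):
    return Fraction(width, height) * par

print(display_aspect(704, 480, Fraction(10, 11)))  # 4/3 (NTSC)
print(display_aspect(704, 576, Fraction(12, 11)))  # 4/3 (PAL)
```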

It is all magic.

Last edited by dpalomaki; 02-04-2023 at 07:50 AM.
#6
02-04-2023, 08:41 AM
 latreche34 Free Member Join Date: Dec 2015 Location: USA Posts: 2,649 Thanked 465 Times in 431 Posts
PAL at 576 x 4/3 = 768 would have been better, giving a perfect 4:3 frame with square pixels, but the standard called for 720. Since the USA and Japan led the development of video production standards back then, represented by RCA and Sony, PAL lands had no choice but to follow the West's lead.
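The square-pixel arithmetic here is easy to check (a trivial sketch: 768 is what PAL would have needed, and 640 is the NTSC equivalent the earlier post called too small):

```python
from fractions import Fraction

# Square-pixel widths a 4:3 frame would require at SD line counts.
for lines in (480, 576):
    print(lines, "->", lines * Fraction(4, 3))
# 480 -> 640 (NTSC consumer; too few samples per the thread)
# 576 -> 768 (PAL; but Rec.601 mandated 720 for both)
```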

France developed its own analog TV standard after WW2, based on 819 scan lines, but it never made it into tape formats and certainly not into digital, so they settled on SECAM for color; luma-wise and in frame rate it is identical to PAL.