Resolution...
It's resolution, not bitrate, that separates SD from HD.
Carefully review these measurements, and compare them to one another:
- Low resolution SD = 352x240 and below (anything below 352x480, actually, with 352x240 / 320x240 being common)
- Medium resolution SD = 352x480 / 480x480 ~ 512x384
- "Full" resolution SD = 720x480 ~ 704x480 / 640x480
- Standard HD = 720p a.k.a. 1280x720 / 1080i a.k.a. 1920x540 (active) ~ or effectively half of 1920x1080
- "Full" HD = 1080p a.k.a. 1920x1080
- In analog-to-digital equivalent terms, "500 lines" analog = 720x480 digital
- VHS "240 lines" = ~352x480 / 335x480
- 8mm "220-240 lines" = ~352x480, but practically lower due to heavy grain of the format
- S-VHS / Hi8 "400 lines" = ~480x480 / 500x480
- There is no analog equivalent for HD; it's all digitally measured.
Most people online entirely screw up lines of resolution when converting analog to digital equivalents. This includes many hackjob services out there -- people who don't understand video very well. Most fail to properly comprehend the HD resolutions, too.
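As a rough illustration of how the conversion works: analog "lines of resolution" (TVL) count resolvable lines per picture *height*, so getting an approximate digital horizontal pixel count means multiplying by the 4:3 display aspect. This is a rule-of-thumb sketch of my own, not an exact spec, and real captures quantize to the standard frame widths in the list above:

```python
# Rough rule-of-thumb conversion from analog "lines of resolution" (TVL,
# lines per picture height) to an approximate digital horizontal pixel
# count for a 4:3 source. An approximation only -- real capture hardware
# quantizes to standard frame widths (352, 480, 720).

ASPECT_4_3 = 4 / 3

def tvl_to_pixels(tvl: int) -> float:
    """Approximate digital horizontal resolution for a given TVL rating."""
    return tvl * ASPECT_4_3

for fmt, tvl in [("VHS", 240), ("8mm", 230), ("S-VHS / Hi8", 400), ("Broadcast", 500)]:
    print(f"{fmt:12s} {tvl} TVL ~ {tvl_to_pixels(tvl):.0f} pixels wide")
```

The output lands near the list above: VHS ~320 (captured at 352), S-VHS/Hi8 ~533 (captured at 480-544), broadcast ~667 (captured at 720).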
Bitrates...
Bitrates simply control the quality of the image. The bitrate needed varies from format to format, and resolution to resolution. Not enough bitrate leads to blocks in MPEG, or a soft, fuzzy image (with some noise) in H.264. Too much just bloats the file, making it take more disc space (or more overhead and bandwidth in the broadcast spectrum).
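To make the "bloat" point concrete, here's some quick back-of-the-envelope math (my own illustration, not from any spec): file size is essentially bitrate times duration.

```python
# Back-of-the-envelope: video file size is roughly bitrate x duration.
# (Audio and container overhead add a bit more on top.)

def file_size_gb(bitrate_mbps: float, duration_minutes: float) -> float:
    """Approximate video payload size in gigabytes (GB, decimal)."""
    bits = bitrate_mbps * 1_000_000 * duration_minutes * 60
    return bits / 8 / 1_000_000_000

movie_minutes = 120  # a two-hour feature
for mbps in (4, 8, 25):
    print(f"{mbps:2d} Mbps for {movie_minutes} min ~ {file_size_gb(mbps, movie_minutes):.1f} GB")
```

That works out to ~3.6 GB at 4 Mbps, ~7.2 GB at 8 Mbps (dual-layer DVD territory), and ~22.5 GB at 25 Mbps -- which is why high-bitrate MPEG-2 needs Blu-ray capacity.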
For MPEG-2 Blu-ray, you're looking at 25Mbps versus the 8Mbps often found on DVDs. Broadcast specs go up to about 50Mbps. MPEG-2 can theoretically be encoded well at up to 100Mbps, on certain profiles. The MPEG-2 encoding specs also differ in other ways. For example, DVD-Video uses MPEG-2 MP@ML encoding, while HD uses one of the higher profiles. MPEG-2 is more than just bitrate and resolution.
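The profile@level split is where those bitrate ceilings come from. Here's a sketch of the approximate maximum bitrates for common MPEG-2 profile/level combinations -- these are quoted from memory, so treat them as illustrative and check ISO/IEC 13818-2 before relying on them:

```python
# Approximate maximum bitrates (Mbps) for common MPEG-2 profile@level
# combinations, per ISO/IEC 13818-2. Values quoted from memory --
# illustrative only; verify against the spec before relying on them.
MPEG2_MAX_MBPS = {
    "MP@ML": 15,       # Main Profile @ Main Level: DVD-Video (DVD further caps video at 9.8)
    "4:2:2P@ML": 50,   # professional/broadcast contribution at SD
    "MP@HL": 80,       # Main Profile @ High Level: HD broadcast, Blu-ray MPEG-2
    "HP@HL": 100,      # High Profile @ High Level: the theoretical ceiling
}

for profile, cap in MPEG2_MAX_MBPS.items():
    print(f"{profile:10s} up to ~{cap} Mbps")
```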
For a good 720p web stream, you can get away with 3-4Mbps, and it looks fine. YouTube, for example, uses less. Hulu is a bit better about bitrates. On a disc, or in the broadcast spectrum, it's generally about double that. It will vary from provider to provider, and studio to studio.
To see the difference in SD vs HD...
The biggest advantage of HD material is not just the resolution, but the quality of the color. HD uses a different color standard (Rec. 709, versus Rec. 601 for SD), and it also does not degrade in transit the way analog sources did.
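One concrete piece of that color difference: Rec. 601 (SD) and Rec. 709 (HD) weight the RGB channels differently when forming luma, so the same RGB pixel codes to slightly different values under each standard. A minimal sketch of that difference:

```python
# Luma (Y') weighting differs between the SD and HD color standards:
# Rec. 601 (SD) and Rec. 709 (HD) use different RGB coefficients, so the
# same RGB pixel yields a slightly different luma value in each.

REC_601 = (0.299, 0.587, 0.114)     # SD luma coefficients
REC_709 = (0.2126, 0.7152, 0.0722)  # HD luma coefficients

def luma(rgb, coeffs):
    """Weighted sum of (R, G, B) in 0..1 using the given standard's coefficients."""
    return sum(c * v for c, v in zip(coeffs, rgb))

pixel = (0.2, 0.6, 0.9)  # an arbitrary bluish test color
print(f"Rec. 601 luma: {luma(pixel, REC_601):.4f}")
print(f"Rec. 709 luma: {luma(pixel, REC_709):.4f}")
```

Decoding HD with SD coefficients (or vice versa) shifts the colors, which is one reason naive SD/HD conversions look subtly wrong.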
The most obvious difference is that HD is all 16:9 widescreen (or put into a 16:9 matte, should it be wider). Older 4:3 content is pillarboxed, with black bars on the sides. Some morons stretch 4:3 content to 16:9 to "fill the screen". I can't stand that; I'd rather not watch it at all than watch it all stretched out.
I would suggest, based on the above two factors alone, that the difference between SD and HD can be seen even on a 19" screen. In terms of resolution alone, you'd be looking at something in the 30-40" range before seeing major clarity differences.
Much of this depends on the quality of the set, too. A small Sony set will look noticeably better than some cheap Vizio or other off-brand.
To see the differences in HD (720p / 1080i vs 1080p)...
With a 55" screen, and 20/20+ vision, I have to flip sources quite a few times to see 720p, 1080i, and 1080p. And even then, I can get it wrong. It's not obvious. Some people insist it's obvious at 30-40 inches. However, it's generally just their imagination. They think it's better, therefore it is. Realistically, about 60" is the minimum size to be able to clearly notice the HD resolution differences. Anything below that, and you're really just guessing.
There is an image law, long known to photographers (because we use it to our advantage!), that an image appears sharper as it gets smaller. The pixels on a 30-50" screen are so small that 1080p looks much like 720p/1080i anyway. Only when you're able to clearly see the individual pixels will you be able to tell the difference, and for that, you really need a huge screen. While 60" is big, it's a minimum. It's generally on those big 96-104" projection screens that 1080p really helps maintain a clear image.
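You can sanity-check those screen-size claims with the standard visual-acuity estimate: 20/20 vision resolves roughly one arcminute, so a pixel stops being individually visible beyond the distance where it subtends less than that. A quick sketch of that calculation (my own math, using the common one-arcminute assumption, not figures from any manufacturer):

```python
import math

# Estimate the farthest viewing distance at which 20/20 vision can still
# resolve individual pixels, using the common one-arcminute acuity rule.
# An illustration of the standard calculation, not a definitive figure.

ARCMIN_RAD = math.radians(1 / 60)  # one arcminute, in radians

def max_resolvable_distance_ft(diagonal_in: float, h_pixels: int, v_pixels: int) -> float:
    """Distance (feet) beyond which one pixel subtends less than 1 arcminute."""
    aspect = h_pixels / v_pixels
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # width from diagonal
    pitch_in = width_in / h_pixels                           # size of one pixel
    return pitch_in / math.tan(ARCMIN_RAD) / 12

for size in (40, 55, 60, 100):
    d720 = max_resolvable_distance_ft(size, 1280, 720)
    d1080 = max_resolvable_distance_ft(size, 1920, 1080)
    print(f'{size}" screen: 720p pixels visible within ~{d720:.1f} ft, 1080p within ~{d1080:.1f} ft')
```

On a 55" set, 1080p pixels blur together beyond roughly 7 feet -- so at a typical couch distance, 720p/1080i and 1080p converge, as argued above.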
Again, the quality of the set matters. Sony quality will be obvious sooner than something like LG or Samsung. Lower-quality sets tend to have more noise in the image (or a lack of image clean-up, should the input be the source of the noise). Many people falsely perceive noise as "detail", which is where many arguments about video sharpness originate.