Let me put it this way for you:
Until recently, I was a professional photographer. Long story, lousy industry now; I focus on marketing and hosting these days. Excluding narcissistic kiddies who shoot too many selfies, I probably shot more images on some days than most people shoot in a year. Sometimes it was RAW, sometimes JPEG. My camera gear cost more than most cars.
I find "only this" or "only that" attitudes to be largely clueless advice when it comes to image formats. Again, I shot RAW and JPEG. TIFF can be better than JPEG, even at maximum quality compression (level 10-12 on most cameras), but how noticeable will it actually be? TIFF is comparable to Huffyuv for video, while JPEG is comparable to MPEG. But the important difference here is that still images are not in motion, so the differences become more negligible. Comparing TIFF to RAW is laughable; the gap there is NOT the same as TIFF vs. JPEG in quality.
My cameras capture anywhere from 12-bit to 16-bit data, which is what RAW preserves, while JPEG is 8-bit. And I have JPEG images that won awards. I shot film for years before digital, and won awards in that time as well. Quality images are not determined by compression.
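To put those bit depths in perspective, here's a quick back-of-the-envelope sketch (not tied to any specific camera) of how many tonal levels per channel each depth gives you:

```python
# Tonal levels per channel at common capture bit depths.
# Rough illustration only; real sensors and codecs add other factors.
for bits in (8, 12, 14, 16):
    levels = 2 ** bits
    print(f"{bits:>2}-bit: {levels:>6,} levels per channel")
# 8-bit gives 256 levels; 16-bit gives 65,536.
```

That extra headroom is why RAW holds up so much better in heavy editing, even though a well-exposed 8-bit JPEG can look indistinguishable straight out of camera.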
sRGB vs. Adobe RGB (1998) vs. others depends on the print workflow.
Most important aspect = intended use. So, what's your intended use?
If you're working at the Library of Congress, yeah, sure, scan those huge images at max bit depth and resolution. We need valuable photos to be retained for history. They're not scanning tons of images, and can afford the time, equipment and space.
But if you're scanning an old box of home photos, that's insane. I scan our old home photos as 8-bit sRGB JPEG in most cases, and only use TIFF or DNG for images that need restoration. And I'm still not done with that project. It's taking forever. I've dedicated full days to it in the past and barely made a dent in one box among many.
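The storage side of that tradeoff is easy to estimate. A quick sketch, using a hypothetical 4x6" print scanned at 600 dpi with 3 RGB channels (illustrative numbers, not from any particular scanner):

```python
# Uncompressed storage per scan: hypothetical 4x6" print at 600 dpi,
# 3 RGB channels. Illustrative only; JPEG compression shrinks the
# 8-bit case by roughly 10x or more in practice.
width_px, height_px = 4 * 600, 6 * 600   # 2400 x 3600 pixels
pixels = width_px * height_px

for bits in (8, 16):
    size_mb = pixels * 3 * (bits // 8) / 1e6
    print(f"{bits:>2}-bit: about {size_mb:.0f} MB uncompressed")
```

Multiply the 16-bit figure by a few thousand family photos and the Library of Congress approach stops looking practical for a home archive.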