Google Search Console says "blocked" by robots.txt?
I'm new to SmugMug and not yet very far up the SEO learning curve, so cut me some slack. The first thing I notice in GSC is that it currently reports 44 pages indexed and 159 "blocked" by robots.txt. It also shows me a robots.txt for my SM site that does indeed seem to block many things.
Is this real? Why would SM be using a robots.txt?
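For anyone wanting to verify this themselves: you can fetch your site's robots.txt in a browser (at `/robots.txt`) and test specific paths against its rules. Here is a minimal sketch using Python's standard `urllib.robotparser`; the rules and paths below are illustrative placeholders, not SmugMug's actual robots.txt.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules, loosely resembling what a photo-hosting
# site might serve (NOT SmugMug's real robots.txt).
robots_txt = """\
User-agent: *
Disallow: /search
Disallow: /keyword/
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check whether Googlebot may crawl a few example paths.
base = "https://example.com"
for path in ["/", "/gallery/summer", "/keyword/sunset", "/search"]:
    status = "allowed" if rp.can_fetch("Googlebot", base + path) else "blocked"
    print(path, status)
```

A page matched by a `Disallow` rule here would show up as "blocked by robots.txt" in GSC's coverage report, which is consistent with the counts you're seeing.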