patoowardfan
You ever watch a show expecting just some entertainment, but then it hits you with something deep and totally shifts how you see a certain topic? For me, it was The Good Place. I went in thinking it was just a quirky comedy, but it made me think so much about ethics and what it really means to be a “good” person. What’s a show that unexpectedly made you rethink something?