I have never seen Doctor Who. What I know of it really doesn't speak to me; it seems too "random new being is the answer" for my taste. No hate against the show, though. I mean, I'm fairly certain I'm basically complaining that a romance book is romantic.
That being said, Doctor Who seems to be quite obviously saying: be maximally tolerant and fight the tyrants and the intolerant. Which apparently counts as woke at this point.
When I was a child, we called that "not being a dick," but I guess it's woke now.
I hope I didn't make you feel like you had to justify your enjoyment of the show.
My point was: I haven't even seen it, and yet even I picked up on the "woke" themes, so it's crazy to act like the show only recently became "woke".