I recently saw that an actor I watched growing up has been hired to discuss politics on a TV station, and it disappointed me because it's a channel I don't like. This is far from the first actor to say something I don't agree with, but it got me thinking. It feels disingenuous when they come out with these opinions AFTER they've made a career and built a fanbase on what are, to me, false pretenses. Then again, when actors become activists or speak out for causes I support, I tend to like them even more. So while I can appreciate that actors have every right to voice their opinions, it does change my opinion of them, for better or worse.
Without arguing about specific issues or people (I don't really care to argue about politics no matter what party or views you side with, and I won't engage in that discussion here) -- do you think actors should be political, or should they stick to acting? Does it matter what venue they use, and when?