Unashamed Christians
21-07-2004, 14:24
I think the American media would be a lot better off if they would just come out and admit their bias. Correct me if I'm wrong, but most newspapers in England generally admit their bias, or you can easily tell it from their writing. I think it is truly ludicrous to claim you are unbiased. That claim goes against everything in human nature: everyone sees things through the lens of their own experiences and worldview, and journalists are no different. Plus, I think the American public would respect its mainstream media a lot more if it just admitted its bias.
Pinkoria
21-07-2004, 14:28
I'd have to agree with you completely. Objectivity is a myth perpetuated by the American media. Nothing that is filtered through the eyes of an individual (such as a reporter) can ever be objective, so it is ludicrous to assume that every media source will look at issues in the same way. It is also hypocritical for an outlet like Fox News to claim objectivity while obviously reflecting personal biases.
And you're right. In Britain, you have the Times of London, which is right-wing, and the Independent and The Guardian, which are left-wing. None of these three publications claims to be "objective".