The cameras seem to follow actors and celebrities, and as a nation we seem to hang on their every word and action. They do a great deal of harm by portraying characters and situations that are less than wholesome, yet they collect Oscars, Grammy awards, and BAFTAs for speaking words written for them to say. There isn't an original thought in their ‘job’ — the clue is in the name. Another well-known UK soap actor, followed by millions, has been charged with serious counts of sexual abuse, including child rape.
There are magazines dedicated to following the lives of the rich and famous, who seldom do anything to help anyone but themselves. They live in the rarefied atmosphere of film studios and our TV screens, and we elevate them to positions of sainthood. Why?
Occasionally, an actor or celebrity will take a trip, usually between jobs, to be seen doing some ‘good’ for the underprivileged in a Third World country, and we gasp in admiration and adoration at their efforts. Our heroes and heroines are good people after all. Maybe they are, but I can’t help going back to the title of their day job. They are actors. How can I trust that they are really interested in the children they are photographed with? Or is it another case of a career needing a boost of good news before they go back to playing the seedy roles they are so good at? Where is the truth? Does it lie in their scripted words? Does it lie in their good works among disadvantaged children, performed while being followed by a film crew? Is their concern real, or just another opportunity to be seen by adoring fans, who can now reflect on how kind they really are when they pay to see their next 18-rated movie?
A recent UK poll named the least trusted professions as bankers, politicians, journalists, and estate agents, but actors and celebrities do not figure in the top ten. So am I out of line when I ask you, “What is truth, and who do YOU trust to speak it?” If we cannot trust these professionals, who or what would YOU place your trust in? Just asking.