I’ve gotten to the point where I don’t even get upset anymore. I am so used to being bombarded with this propaganda that I just accept it. The subject I’m talking about isn’t the normal left-right debate. It’s the entertainment media.
Has anyone else noticed that all the bad guys are rich? There isn’t a “good” rich guy anywhere. The star is usually a poor guy or a middle class guy, but never a rich business guy.
On television or in the movies the lesson that is constantly taught is no one can accumulate wealth without being evil and corrupt. The rich guy murders, bribes, steals, and shame of all shame, rapes the environment. The rich guy drives a Cadillac and wears a suit. He always has a gun.
We are constantly having the notion ingrained in us that financial success can only be achieved through evil. Every successful businessman is guilty of major crimes. How else could he have gained his status? Hard work, long hours, and taking financial risks are never shown in the stories.
This is a form of brainwashing the gullible viewer. If the message is repeated often enough, in enough ways, it eventually becomes truth to the viewer. My real problem is what this is doing to America's youth. These people are the corporate leaders of tomorrow. They grow up expecting business to be cutthroat, and they go on to practice the ethics they have accepted as the way business works. The moral underpinning of the country is being destroyed.
When I started my career, business was sometimes done with a handshake. Your word was your bond. Today things have changed; everything is done with a written contract. If you don’t have your corporate attorney go over all the fine print, you get exploited. There is no morality anymore. We are all poorer for it.