Jeff Nelson

Posted November 23, 2010

Published in Health


How some doctors make money


Let's face it, health care in the US is about selling drugs and making big profits for the health care industry.

It's not about getting people information to save their lives or protect their health.

It's about getting doctors to write prescriptions and get payoffs.

Here is a practice that would be banned in a world where health care actually meant "caring for people."

There are services in Hollywood for casting agents and producers to try to find actors to audition for TV, movies and commercials.

Someone who regularly receives these casting notices sent me this jewel this morning.

Here's what the American health care system is looking for today:

- - - - -


- - - - -

This is what American "health care" is about -- hiring doctors to shill for drug companies.

How can you trust a doctor who is being paid tens of thousands of dollars by a pharmaceutical company to hawk their products?

Where is the disclaimer on the ad: "I was paid to sell you this"?

When you go into the American system of medicine, you are inside a profit machine, and little else.