Like it already is in many countries?
If you drive a car, you MUST have car insurance; it is required by law in case you have an accident.
So why isn't it a law that everyone must have health insurance? Without it, you can't earn money to pay taxes, and often you will simply die instead.
It is in the financial interest of the government that you have health insurance, so that you can keep paying taxes as a U.S. citizen.
So why isn't it a legal requirement, as it is in many other democracies, that everyone have health insurance?