The reason why American Healthcare is so expensive

American healthcare is the most expensive healthcare in the world. This could be because of two reasons:

  1. The healthcare system is poorly organised
  2. The USA has a higher disease burden

I think the main reason is number 1, which in turn also contributes to number 2, creating even higher healthcare expenses.

The reason

A large group is uninsured. This drives up the costs for everyone.

Imagine 1,000 uninsured people. Of those 1,000 people, 30 develop chest pain. Because they are uninsured, they do not seek medical care. Of those 30, 10 people develop a myocardial infarction and present at the ER. The hospital has to treat them, and costs are incurred for 10 myocardial infarctions and their long-term consequences.

Contrast this to the next example:

A large group is insured. This will dampen the costs for everyone.

Imagine 1,000 insured people. Of those 1,000 people, 30 develop chest pain. Because they are insured, they seek medical care and are properly treated by cardiologists. This proper treatment prevents myocardial infarctions: only 1 of the 30 develops one. This not only improves the quality of care, it also decreases the total costs incurred in the healthcare system.
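
To make the comparison concrete, here is a back-of-the-envelope sketch in Python. The per-patient cost figures are hypothetical placeholders I chose for illustration, not real data; the point is only that early outpatient treatment can end up far cheaper than the infarctions it prevents.

```python
# Back-of-the-envelope comparison of the two scenarios above.
# All cost figures are hypothetical placeholders, not real data.

COST_CARDIOLOGY_WORKUP = 1_000        # assumed cost of one early outpatient work-up
COST_MYOCARDIAL_INFARCTION = 100_000  # assumed cost of one infarction incl. long-term care

def uninsured_group_cost(chest_pain=30, infarctions=10):
    """Nobody seeks early care; 10 of the 30 progress to an infarction."""
    return infarctions * COST_MYOCARDIAL_INFARCTION

def insured_group_cost(chest_pain=30, infarctions=1):
    """All 30 get an early cardiology work-up; only 1 still progresses to an infarction."""
    return chest_pain * COST_CARDIOLOGY_WORKUP + infarctions * COST_MYOCARDIAL_INFARCTION

print("Uninsured group:", uninsured_group_cost())  # 1,000,000 under these assumptions
print("Insured group:  ", insured_group_cost())    # 130,000 under these assumptions
```

Under these assumed figures the insured group is cheaper overall, even though all thirty people are seen by a doctor instead of only the ten who end up in the ER.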

Comparison with cleaning your room:

If you clean your room every day, it is an easy task and takes only 5 minutes.

However, if you wait a long time, it becomes a much harder task: cleaning only once a week will take you an hour. Compare this to 7 × 5 = 35 minutes of daily cleaning. Ignoring early problems will result in much bigger problems down the road.

Multiple factors:

People of low socioeconomic status have a much higher disease burden. This same group is also more often uninsured, which compounds the costs.

Question:

Is this a dichotomous effect (insured vs uninsured), or can a comparable continuous effect be found in healthcare systems with universal insurance? This could be interesting for finding ways to reduce costs in universal healthcare, which in my eyes is the gold standard for a good health economy.

Bonus video with extra effect:

Some quotes about Entropy I found

How do you explain entropy to the common person?

There’s a great (probably apocryphal) story on this. Supposedly Shannon and John von Neumann were having a conversation at Princeton in 1940, when Shannon was first piecing his theory together. Shannon approached the great man with his ideas on the problem of quantifying information, and how it related to uncertainty, and then asked what he should call this thing. Von Neumann answered at once: say that information reduces “entropy.” For one, it was a good, solid physics word. “And more importantly,” he went on, “no one knows what entropy really is, so in a debate you will always have the advantage.”

Now, as I noted, our research suggests that this conversation didn’t exactly happen in those terms. But as we put it in the book, “good science tends to generate its own lore.” And the reason this piece of lore has stuck around is that entropy is such a fuzzy concept.

Here, in very rough terms, is how Shannon uses it. He says that information resolves uncertainty, so the messages that resolve the highest amount of uncertainty convey the greatest amount of information. For example, the outcome of flipping a fair coin is more uncertain than the outcome of flipping a weighted coin, so the fair coin stores more information. This implies that, contrary to our ordinary language use of the word “information,” a string of random-looking text contains more information than a string of comprehensible text (which is structured by all sorts of rules and patterns that make it predictable). If we take entropy to mean “disorder” (and I can hear physicists wince as I try to sum up a difficult concept like that), then the most disorderly-looking messages are the most information-rich in Shannon’s terms. By contrast, more patterned messages are lower in entropy, and less information-rich in Shannon’s terms.

As I said, that’s a very, very rough outline. But I’d suggest that Shannon really uses “entropy” in his work as something of a metaphor, taking a concept from the physical sciences and finding it as a useful explanatory concept in information theory. (I’d also point out that he had predecessors in this regard, including Norbert Wiener and Leo Szilard.)
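
The fair-coin-versus-weighted-coin point in the quote can be checked with Shannon's entropy formula, H = -Σ p · log2(p). The snippet below is my own illustration, not part of the quoted text, and assumes a weighted coin that lands heads 90% of the time.

```python
from math import log2

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # weighted coin (assumed 90/10): ~0.47 bits
```

The fair coin comes out at exactly 1 bit, the weighted one at well under half a bit, matching the claim that the more uncertain outcome carries more information.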

 

How I understand entropy is explained by Shannon's biographers in a later comment. It is actually a very lucid concept: entropy = information. To measure the information, you count the number of yes/no questions you have to ask to fully understand a system. Now I do wonder: when do you know that you fully understand a system?

And also, you can measure the amount of information by counting how many yes/no questions the recipient would have to ask the sender to figure out what the message is. For example, if Alice shuffles a standard deck of 52 cards thoroughly, and takes a secret peek at the top card in the deck, Bob would need to ask Alice 5 or 6 yes/no questions to figure out for sure which card it is, for example:

  1. “Is the card’s suit red?” (Answer: no.)
  2. “Is the card’s suit spades?” (Answer: yes.)
  3. “Does the card have a number less than or equal to 8?” (Answer: yes.)
  4. “Does the card have a number less than or equal to 5?” (Answer: no.)
  5. “Does the card have the number 6 or 7?” (Answer: yes.)
  6. “Is the card six of spades?” (Answer: no.)

So in this example, after six yes/no questions and truthful answers, Bob knows for sure that Alice’s card must be the seven of spades. With Shannon’s theory we can shortcut this counting process—for a card chosen at random with equal probability as all the other cards in the deck, the average number of yes/no questions needed is the base two logarithm of the number of equally likely alternatives: log2(52) ≈ 5.7 questions on average.

So if instead of making Bob play this little game, Alice just told him the randomly chosen card, the amount of information she would be giving him is the same as the average number of questions in the game: 5.7 bits.
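
As a quick sanity check of that 5.7-bit figure (my own sketch, not part of the quoted comment), the snippet below computes log2(52) and counts how many questions a simple halving strategy needs in the worst case.

```python
from math import ceil, log2

DECK_SIZE = 52

# Entropy of one card drawn uniformly at random from a 52-card deck.
print(f"log2(52) = {log2(DECK_SIZE):.2f} bits")  # ~5.70

# A halving ("is it in this half?") strategy never needs more than ceil(log2(52)) questions.
candidates, questions = DECK_SIZE, 0
while candidates > 1:
    candidates = ceil(candidates / 2)  # each yes/no answer keeps at most half the cards
    questions += 1
print(f"Worst case with halving questions: {questions}")  # 6
```

The worst case is 6 because 52 is not a power of two; the 5.7 figure is the average number of questions over all equally likely cards.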

URL