Current estimates of COVID-19 Mortality Rates?

It seems like COVID-19 mortality rates are dropping quickly. Florida had a huge spike in cases, and has had more cases than NY, but far fewer deaths. Since the February-April peak, hospital capacity has opened up, there is more equipment available, and there are new and better treatments (having the patient lie prone, administering steroids and antivirals, and delaying ventilation all spring immediately to mind).

So, I’m wondering: what is the current first-world mortality rate, in places with good hospitals and adequate capacity? The Johns Hopkins page here would seem to over-estimate mortality relative to the current state of the art. Does anyone have a good source for today’s mortality rates?

You can definitely see differences in countries over time. Take the UK for example. Take a look at their daily cases graph over time and compare it with their daily deaths over time. Two different “waves” yielded very different death rates. I assume the big reasons are:

  1. Better treatment as you mentioned.
  2. The first wave’s infections were actually much higher, but few were tested. So the confirmed case count was smaller, which makes the apparent death rate higher.
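The testing effect in point 2 can be sketched with a quick calculation. All numbers below are invented for illustration; the only point is that dividing the same death count by a smaller confirmed-case count inflates the apparent rate:

```python
# Sketch: how undertesting inflates the case fatality rate (CFR)
# relative to the infection fatality rate (IFR).
# All figures are invented for illustration.

true_infections = 1_000_000
deaths = 5_000                          # implies a true IFR of 0.5%

ifr = deaths / true_infections

# Suppose wave 1 detected only 10% of infections and wave 2 detected 60%.
for wave, detected_fraction in [("wave 1", 0.10), ("wave 2", 0.60)]:
    confirmed_cases = true_infections * detected_fraction
    cfr = deaths / confirmed_cases
    print(f"{wave}: CFR = {cfr:.2%} vs true IFR = {ifr:.2%}")
```

With these made-up detection rates, the same underlying 0.5% IFR shows up as a 5% apparent death rate in the first wave and under 1% in the second.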

I am also curious what the death rate for developed countries has been over the last two months. I am wondering if it might start to approach the yearly flu rates.

On the other hand, during the first wave there was a major impact in nursing homes, and the elderly have a high death rate. Currently it is hitting younger generations (who are out socializing and going to school) much more, and they have a lower death rate.

So no one has that, I guess. I know some poster here has been pretty actively pulling down COVID stats, but I can’t remember which one (or I would @ him or her).

Here’s the CDC’s current best estimate of the infection fatality rate (deaths/total infections):
COVID-19 Pandemic Planning Scenarios | CDC (see Table 1 about halfway down the page, “Scenario 5” column)

0-19 years: 0.003%
20-49 years: 0.02%
50-69 years: 0.5%
70+ years: 5.4%

Here is a Swiss study with similar numbers:
https://osf.io/wdbpe/

5-9 years: 0.0016%
10-19 years: 0.00032%
20-49 years: 0.0092%
50-64 years: 0.14%
65+ years: 5.6%
All: 0.64%

A meta-analysis of IFR studies from around the world:

Under 55 years: “Close to zero”
55-64 years: 0.4%
65-74 years: 1.3%
75-84 years: 4.5%
85+ years: 25%

It’s pretty clear that a single fatality rate across all age groups isn’t very informative. The risk of death rises exponentially with age. In countries like Italy, with one of the oldest populations in the world, the IFR will be higher.
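To make the age-structure point concrete, here is a small population-weighted calculation using the CDC Scenario 5 age-band IFRs quoted above. The two age distributions are hypothetical stand-ins for a "younger" and an "older" country, not real census data:

```python
# Population-weighted overall IFR from age-band IFRs.
# Age-band IFRs are the CDC "Scenario 5" figures quoted upthread;
# the two age distributions are hypothetical, for illustration only.

ifr_by_age = {          # fraction of infected who die
    "0-19":  0.00003,
    "20-49": 0.0002,
    "50-69": 0.005,
    "70+":   0.054,
}

# Hypothetical population shares (each sums to 1.0).
younger_country = {"0-19": 0.35, "20-49": 0.40, "50-69": 0.18, "70+": 0.07}
older_country   = {"0-19": 0.18, "20-49": 0.35, "50-69": 0.28, "70+": 0.19}

def overall_ifr(age_shares):
    # Assumes infections spread proportionally to population --
    # a simplification, since attack rates also differ by age.
    return sum(share * ifr_by_age[band] for band, share in age_shares.items())

print(f"younger country: {overall_ifr(younger_country):.2%}")
print(f"older country:   {overall_ifr(older_country):.2%}")
```

With these made-up distributions, the same age-specific risks produce an overall IFR of roughly 0.5% in the younger country and roughly 1.2% in the older one.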

That is a shocking increase with age! I wonder if anyone is trying to untangle co-morbidities from age. 65+ people are much more likely to have heart issues, diabetes, etc. How does their mortality compare to younger people with the same issues? How do healthy older people compare to healthy younger people?

I’m not asking you to chase that down or anything. Thanks for those stats!

This page from the University of Oxford Centre for Evidence-Based Medicine provides updates on the global CFR (deaths/confirmed cases).

At the bottom is a discussion of IFR, as well.

I agree, that would be very good to know. If I come across anything insightful I’ll post it.

The CEBM page linked above hints at a partial answer to your question:

Modelling the data on the prevalence of comorbidities is also essential to understand the CFR and IFR by age (the prevalence of comorbidities is highly age-dependent and is higher in socially deprived populations). It is also not clear if the presence of other circulating influenza illnesses acts to increase the IFR (testing for co-pathogens is not occurring). And whether certain populations (e.g., those with heart conditions) and also in areas of high social deprivation put people at more risk of dying.

  • In those without pre-existing health conditions, and over 70, the data suggests the IFR will likely not exceed 1%.

That last bullet point seems pretty spot on. Thanks for that!

The second is probably the most important element - the numbers reported were the case fatality rate, not the infection fatality rate.

A third factor that you didn’t mention is that susceptibility to infection is very likely correlated with severity of outcome. So even if we are talking about the infection fatality rate, if people who are more likely to die are also more likely to contract the disease, those people will be overrepresented in the earlier stages. This factor only becomes significant after a large proportion of the population has been exposed.

Can you explain why you think that those most susceptible to infection are also likely to have the most severe outcomes? Not disputing per se, just not understanding.

Obviously the most severe cases are the most likely to be identified, but my naive take would be that susceptibility to infection correlates more with how likely someone is to be present at superspreader events.

I personally think that much of any improvement in the true IFR is because the early days were a complete failure at protecting those in at-risk congregate settings such as nursing homes, and that even a modest level of getting our shit together in that regard has had huge impacts.

Not speaking for @Riemann but it seems like that idea needs to get split into two sub-steps.

My likelihood of catching disease is a result of my personal likelihood of exposure multiplied by my personal likelihood of going from exposed to infected (given having been exposed).

I could argue (from precious little solid knowledge) that immune system robustness & skill & individual genetic susceptibility to the pathogen are the determiners of the latter half of that formula. Whereas where & how I live, what the community infection rate around me is, and how (in)cautiously I behave control the former half.

I submit those two halves are not totally independent, at least for judicious people, but they also have quite different driving factors. Any given individual could be strong or weak in either, neither, or both.

Once having split these two halves apart though, I could also believe that high likelihood on the second half correlates with a high likelihood of the disease, having been acquired, going on to become severe. The same weak immune system or genetic susceptibility that permits the disease to get a toe-hold in me also permits it to run farther and faster in me than in someone better situated. That’s “common sense” although we all know how unreliable that is in complex things like biology.

Conversely, a disease this variably severe suggests there’s a lot more variables at work than I just suggested. IOW someone’s proneness to cytokine storms under infectious insult may be orthogonal to someone’s proneness to contracting COVID given sufficient exposure. Again one could have either weakness, neither weakness, or both weaknesses perhaps independently.

The data is of course still not completely clear, but it seems to me that teens and young adults catch the virus as easily as the elderly do, yet are at much less risk of severe disease. I’m not sure we have many examples even of that common-sense presumption. Considering influenza as an example: kids get infections more frequently than those 65+ do, but die or develop severe disease at a small fraction of the rate. (Yes, one does need to factor in that they likely have more exposures per individual, though …)

It is an interesting thought though and may be true.

OK, so for a 55 year old with no particular health issues, what do you think is the current probability of dying from it?

Based on the information referenced in this thread inclusive of the quote in that Oxford review of “In those without pre-existing health conditions, and over 70, the data suggests the IFR will likely not exceed 1%.” and seeing that 55 is at maybe a tenth of the over 70 numbers overall I’d WAG that IFR at under 0.1%. That’s also consistent with an assumption that the deaths at that age are skewed to those with particular health issues, especially significant obesity.

What’s your best guess?

After accounting for age and comorbidity, there still seems to be substantial unexplained variation in severity. That variation is likely attributable to differences in immune response. For a given level of exposure, a weaker immune response makes infection more likely; and if infection occurs, makes severe outcomes more likely. If that model is correct, we’d expect those with weaker immune responses to be infected more easily and to be overrepresented in the early stages; and therefore the proportion of severe cases to be higher in the early stages. The proportion of severe outcomes and the infection mortality rate would then fall over time, other things being equal.
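That mechanism can be illustrated with a toy deterministic simulation. Everything here is invented: two groups, one with a weaker immune response that is both easier to infect and more likely to die if infected. As that group is depleted early, the cumulative measured IFR drifts downward:

```python
# Toy model: "weak" individuals are both more susceptible and more
# likely to die if infected. All parameters are invented.

groups = {
    # name: (initial susceptible, per-step infection prob, fatality prob)
    "weak":   (100_000, 0.05, 0.050),
    "strong": (900_000, 0.005, 0.002),
}

susceptible = {name: n for name, (n, _, _) in groups.items()}
total_infected = 0.0
total_deaths = 0.0

for step in range(1, 21):
    for name, (_, p_inf, p_die) in groups.items():
        new_infected = susceptible[name] * p_inf   # expected value, not stochastic
        susceptible[name] -= new_infected
        total_infected += new_infected
        total_deaths += new_infected * p_die
    if step in (1, 10, 20):
        print(f"step {step:2d}: cumulative IFR = {total_deaths / total_infected:.2%}")
```

The printed cumulative IFR falls over the course of the run because the weak group burns through its susceptibles faster, so later infections are increasingly drawn from the lower-risk group.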

Different is not necessarily weaker though.

So, for example, as linked to in the breaking news thread, a significant amount of the variation in severity with infection may be related to deficits in Type I interferon from auto-antibodies or genetic deficiencies. That sort of difference has no impact that I know of on the rate of infection while it apparently has a large impact on the severity of infection.

Also in that post was a link to an article detailing how an uncoordinated T-cell response to infection may predispose to greater severity. But note that a more coordinated T-cell response is still a response to infection; it does not prevent infection from occurring.

Why would you think it has no impact on the rate of infection? I would certainly think it’s reasonable to assume that if your immune system is good at fighting the virus once it’s established in your body, it is also more likely to get on top of a small amount of virus you’ve been exposed to and destroy it before it replicates and crosses the quantitative threshold for a positive test - i.e., prevent infection.

Sure, there are some aspects of the immune system that act in different ways and with variable delay upon exposure. But in general terms, preventing infection and controlling an established infection are about the same thing: how effectively your immune system can destroy the virus.

I have literally zero idea, so I’ll go with your guess. I’m around that age and I’m not overweight and don’t have diabetes or heart issues, so here’s hoping you’re right!

With 200k+ deaths and 7 million+ cases, you’d think someone would be slicing and dicing that data in all different (and statistically significant) ways.

Written accurately: while it may make some theoretical sense that a specific Type I interferon deficiency impacts the rate of infection as well as the severity of infection, I know of and can find nothing that demonstrates such an impact. What I can find is discussion of how Type I interferons are produced during viral infections and are key regulators of the response to them. Note that too strong a Type I interferon response is blamed for infection severity in some cases as well, for example promoting bacterial superinfections after primary influenza infection.

IMHO you downplay the differences between how the immune system prevents an infection and how it fights one off. But ’tis a fairly minor point, and while I don’t know of any evidence supporting it, you could be right. And it can quickly get pedantic as to when an exposure is labelled an infection that was quickly dealt with versus an exposure that did not lead to an infection at all, i.e. one prevented.