So we were talking about excess deaths, which means that supporting your argument with a paper arguing that a previous finding of excess deaths was flawed is probably not the strongest argument you could make.
Increased numbers of injuries but not deaths could be, for example (purely making things up off the top of my head here), due to higher levels of distractedness among average drivers brought on by fear of terrorism, which results in more low-speed, surface-street collisions, while there’s no change in high speed collisions because a short spell of distractedness on the highway is less likely to result in an accident.
> which results in more low-speed, surface-street collisions, while there’s no change in high speed collisions because a short spell of distractedness on the highway is less likely to result in an accident.
That's not a remotely plausible model though. There are recorded cases of e.g. 1.6 seconds of distractedness at high speed causing a fatal collision. Anything that increases road injuries is almost certainly also increasing deaths in something close to proportion, but a given study size obviously has a lot more power to detect injuries than deaths.
Yeah, I was not really trying to argue that that was actually the case, so won’t waste time trying to defend the merits of the model I pulled out of my arse.
Alternatively then, perhaps safety developments made cars safer to drive around the same time? Or maybe advances in medicine made crashes less likely to be fatal? Or perhaps there’s some other explanation that doesn’t immediately spring to mind; either way, it’s irrelevant.
The only point I’m really making is that the data OP referred to does not show an increase in excess deaths, and in fact specifically fails to find this.
> The only point I’m really making is that the data OP referred to does not show an increase in excess deaths, and in fact specifically fails to find this.
That data absolutely does show an increase in deaths, it's right there in the table. It fails to find a statistically significant increase in deaths. The most plausible explanation for that is that the study is underpowered because of the sample size, not that some mystical effect increased injuries without increasing deaths.
> That data absolutely does show an increase in deaths
Please be careful about removing the word 'excess' here. The word 'excess' is important, as it implies statistical significance (and is commonly understood to mean that - https://en.wikipedia.org/wiki/Excess_mortality).
I didn't argue that there was no change in the number of deaths, and I did not say that the table does not show any change in the number of deaths.
Notably, although 2002 had a higher number of fatalities, the number of miles travelled by road also increased. However, that increase is a continuation of a growing trend running from 1980 until 2007, rather than an exceptional jump in distance travelled.
Also, while 2002 was at the time the worst year for total fatalities since 1990, 2005 later turned out to be worse.
The fatality rate per 100,000 population in 2002 was 14.93, around a 1% worsening on the previous year. But 2002 does not really stand out: similar worsenings happened in 1988, 1993, 1994, 1995, 2005, 2012, 2015, and 2016 (to varying degrees).
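For anyone who wants to check the arithmetic, here's a quick sketch of that rate calculation. The fatality counts and populations below are approximate round figures, not the exact NHTSA/Census values, so treat this as illustrative:

```python
# Rough reconstruction of the ~14.93 per-100,000 figure and the ~1% worsening.
# Counts and populations are approximate, for illustration only.
fatalities = {2001: 42_196, 2002: 43_005}            # approx. national totals
population = {2001: 285_000_000, 2002: 288_000_000}  # approx. US population

rate = {y: fatalities[y] / population[y] * 100_000 for y in fatalities}
change = (rate[2002] - rate[2001]) / rate[2001] * 100
print(f"2001: {rate[2001]:.2f}, 2002: {rate[2002]:.2f}, change: {change:+.1f}%")
```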
One other observation is that in 2000 there was a population increase of 10 million (from 272 million to 282 million), while the years on either side are pretty consistently around 3 million. I'm not sure why that is, but a change in the denominator in the previous year is also interesting if we're making a comparison (e.g. maybe the birth rate was much higher due to parents wanting 'millennial babies', none of whom are driving and so are less likely to be killed in a crash; again, just a random thought, not a real argument from my side that I would try to defend).
The reason 'statistical significance' is important is because it allows us to look at it and say 'is there a specific reason for this, or is this within the bounds of what we would expect to see given normal variation?'. The data don't support a conclusion that there's anything special about 2001/2 that caused the variation.
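One rough way to make 'within the bounds of what we would expect given normal variation' concrete is to compare the 2001-to-2002 change in the fatality rate against the spread of year-to-year changes in surrounding years. The per-100,000 rates below are approximate, so this is a sketch of the method rather than a real analysis:

```python
# Is the 2001->2002 uptick unusual relative to typical year-to-year swings?
# Rates per 100,000 population for 1995-2005; values are approximate.
import statistics

rates = [15.91, 15.86, 15.69, 15.36, 15.30, 14.86, 14.79, 14.93, 14.75, 14.59, 14.72]
changes = [b - a for a, b in zip(rates, rates[1:])]
mu, sd = statistics.mean(changes), statistics.stdev(changes)
z = (changes[6] - mu) / sd   # changes[6] is the 2001->2002 jump (+0.14)
print(f"mean change {mu:+.3f}, sd {sd:.3f}, 2002 z-score {z:+.2f}")
```

A z-score well under 2 is what 'not statistically significant' looks like here; the conclusion obviously depends on the real data, but this is the shape of the argument.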
So what is your actual model here? Do you believe that the rate of injuries didn't actually go up (despite the statistically significant evidence that it did), that some mysterious mechanism increased the rate of injuries without increasing the rate of deaths, or that the rate of deaths really did increase, but not by enough to show in the stats (which are much less sensitive to deaths than to injuries, just because deaths are so much rarer)? I know which of those three possibilities I think is more plausible.
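The sensitivity point is easy to see with a back-of-envelope Poisson calculation; the counts below are hypothetical, chosen only to reflect that injuries are far more common than deaths:

```python
# Same relative increase, very different detectability: under a Poisson
# model the sd of a count is sqrt(expected count), so z scales with
# sqrt(baseline). Counts here are hypothetical.
import math

def poisson_z(baseline_events: int, relative_increase: float) -> float:
    extra = baseline_events * relative_increase
    return extra / math.sqrt(baseline_events)

print(poisson_z(70_000, 0.05))  # injuries: comfortably past z = 1.96
print(poisson_z(1_000, 0.05))   # deaths: same 5% rise, below z = 1.96
```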
We are talking about excess deaths, not injuries. It feels like the direction you are taking is going further and further away from the original point, and you are misrepresenting what I have said to fit your argument.
I did not say that injuries did not go up. I did not say that the rate of deaths did not go up. I did not say that the rate of deaths did not “show in the stats” (unless by that you mean “was not statistically significant”).
I don’t need a model for the injuries question because that isn’t the point we’re arguing, but I might suggest something like “after 9/11, people took out more comprehensive health insurance policies, and so made claims for smaller injuries than they would have in previous years”.
A suitable model for explaining excess deaths might be something like “after 9/11, people chose to drive instead of fly, and driving is more dangerous per mile than flying, resulting in excess deaths”. I’m not sure if that is your exact model, but it’s typically what people mean; happy for you to correct me.
The problem with that model is that there’s no statistically significant increase in miles driven either. I can’t think of a model which would explain a higher fatality/injury rate per mile driven.
Out of interest, what would be your model for explaining why there were more injuries per mile?
If you found a single story of someone deciding to drive instead of take a plane in October 2001 because of 9/11, and that person died in a car crash, would that be enough for you to be satisfied that I am wrong?
> The problem with that model is that there’s no statistically significant increase in miles driven either. I can’t think of a model which would explain a higher fatality/injury rate per mile driven.
> Out of interest, what would be your model for explaining why there were more injuries per mile?
Well, is there a statistically significant difference in injuries per mile? Or even a difference at all? That the difference in injuries was statistically significant while the difference in miles driven wasn't does not imply that the former changed by a larger proportion than the latter.
Pretty much everyone including all of the papers we've talked about assumes there was an increase in driving. Do you actually think driving didn't increase? Or is this just another area where these concepts of "statistically significant" and "excess" are obscuring things rather than enlightening us?
> If you found a single story of someone deciding to drive instead of take a plane in October 2001 because of 9/11, and that person died in a car crash, would that be enough for you to be satisfied that I am wrong?
I'm interested in knowing whether deaths actually increased and by how much; for me statistical significance or not is a means to an end, the goal is understanding the world. If we believe that driving did increase, then I don't think it's a reasonable null hypothesis to say that deaths did not increase, given what we know about the dangers of driving. Yes, it's conceivable that somehow driving safety increased by exactly the right amount to offset the increase in driving - but if that was my theory, I would want to actually have that theory! I can't fathom being satisfied with the idea that injuries somehow increased without increasing deaths and incurious not only about the mechanism, but about whether there really was a difference or not.
All of these things - miles driven, injuries, deaths - should be closely correlated. If there's evidence that that correlation actually comes apart here, I'm interested. If the correlations hold up, but the vagaries of the statistics are such that the changes in one or two of them were statistically significant and the other wasn't, meh - that happens all the time and doesn't really mean anything.
The study does not quote miles driven for their specific sample, so I can’t speak to this or answer your question directly, and the national numbers don’t cover injuries. The numbers in the survey are also a bit vague about injuries, including “possible injuries” and “injury with severity unknown”, and they are taken separately from the incapacitating-injury data. The data on injuries are objectively lower quality than the data on deaths, and in any case deaths were the focus of the original question, so I’d rather avoid getting sidetracked.
To respond to your question (admittedly not to answer it directly): nationally there was indeed an increase in miles driven, an additional 59 billion miles in 2002 compared to 2001, and indeed there was an increase in deaths. I would expect an increase in injuries as well.
Looking at this in isolation, you can say “oh, so because of 9/11, people drove 59 billion more miles, which resulted in more deaths and injuries”, but in my opinion the real question, if you want to understand the world better, is “how many more miles would folks have driven in 2002 compared to 2001 if 9/11 had never happened?”
We obviously can’t know that, but we can look at data from other years. For example, from 1999 to 2000 the increase in miles driven was 56 billion, from 2000 to 2001 it was 50 billion, from 2002 to 2003 it was 34 billion, and from 2003 to 2004 it was 75 billion.
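Putting those figures side by side makes the point; a trivial sketch:

```python
# Year-over-year increases in US vehicle miles driven, in billions
# (figures as quoted above).
increases = {
    "1999-2000": 56,
    "2000-2001": 50,
    "2001-2002": 59,
    "2002-2003": 34,
    "2003-2004": 75,
}
others = [v for k, v in increases.items() if k != "2001-2002"]
print(min(others), increases["2001-2002"], max(others))
# The 2001-2002 increase sits comfortably inside the range of nearby years.
```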
Miles driven, injuries, and deaths are indeed all closely correlated. But so are population size, the price of oil, and hundreds of other factors. If your question is “did more people die on the roads in 2002 than in 2001”, the answer is yes. Again, I assume the same is also true of injuries, although I can’t support that with data.
That wasn’t OP’s assertion though, OP’s assertion was that closing down airspace does not lead to zero excess deaths. My argument is that the statistics do not support that conclusion, and that the additional deaths in 2002 cannot rigorously be shown even to be unusually high, let alone caused by 9/11.
> That wasn’t OP’s assertion though, OP’s assertion was that closing down airspace does not lead to zero excess deaths. My argument is that the statistics do not support that conclusion, and that the additional deaths in 2002 cannot rigorously be shown even to be unusually high, let alone caused by 9/11.
What we can show rigorously and directly is a tiny subset of what we know. If your standard for saying that event x caused deaths is that we have a statistically significant direct correlation between event x and excess deaths that year, you're going to find that most things "don't cause deaths". Practically every dangerous food contaminant is deemed dangerous on the basis that it causes an increase in condition x, and condition x is known to be deadly, not because we can show directly that people died from eating it. Hell, even something like mass shootings probably doesn't produce enough deaths to show up directly in the death numbers for the year.
I think it's reasonable to say that something we reasonably believe caused deaths that would not have occurred otherwise causes excess deaths. If you actually think the causal chain breaks down - that 9/11 didn't actually lead to fewer people flying, or that that didn't actually lead to more people driving, or that the extra driving didn't actually lead to more deaths - then that's worth discussing. But I don't see any value in applying an unreasonably high standard of statistical proof when our best available model of the world suggests there was an increase in deaths, and it would actually be far more surprising (and warrant more study) if there wasn't such an increase.
> If your standard for saying that event x caused deaths is that we have a statistically significant direct correlation between event x and excess deaths that year
It isn't.
> Hell, even something like mass shootings probably aren't enough deaths to show up directly in death numbers for the year.
Yes, there's (probably) no statistically significant link between mass shootings and excess deaths (maybe with the exception of school shootings and excess deaths in the population of school children). But you can directly link 'a shooting' and 'a death', so you don't need to look at statistics to work out if it's true that shootings cause deaths. Maybe if mass shootings started to show up in excess deaths numbers, we'd see a different approach to gun ownership, but that's a separate discussion. You can't do the same with 'closing airspace' and 'deaths on the road'.
When you're looking at a big event like this, there are a lot of other things that can happen. People could be too scared to take trips they might otherwise have taken (meaning a reduction in mileage, as folks are not driving to the airport), or the opposite: 9/11 might have inspired people to visit family they otherwise might not have visited (meaning an increase in mileage). Between 2000 and 2003, the price of gas went down, which might have encouraged people to drive more in general, or to choose to drive rather than fly for financial reasons (although if you wanted to mark that down as 'caused by 9/11', that's probably an argument you could win). You can throw ideas out all day long. The way we validate which ideas have legs is by looking at statistical significance.
Here are some more numbers for you: in 2000, there were 692 billion revenue passenger miles flown in the US. In 2002, it was 642 billion. So roughly 50 billion fewer miles were flown in 2002. But the number of miles driven in 2002 was 100 billion higher than in 2000 (and note, this is vehicle miles, not passenger miles, whereas the airline numbers count passengers). So clearly something else is at play; you can't (only) attribute the increase in driving to people driving instead of flying.
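In rough numbers (using the figures quoted above, and ignoring the passenger-miles vs vehicle-miles mismatch, which would only widen the gap):

```python
# How much of the extra driving could substitution from flying explain?
flown = {2000: 692, 2002: 642}               # billions of revenue passenger miles
drop_in_flying = flown[2000] - flown[2002]   # 50B passenger-miles lost
rise_in_driving = 100                        # 100B extra vehicle-miles (as above)
unexplained = rise_in_driving - drop_in_flying
print(unexplained)  # billions of vehicle-miles not explained by substitution
```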
> If you actually think the causal chain breaks down - that 9/11 didn't actually lead to fewer people flying, or that actually didn't lead to more people driving, or that extra driving didn't actually lead to more deaths - then that's worth discussing
I believe that the causal chain does exist but it's weakened at every step. Yes, I think 9/11 led to fewer people flying, but I think that only a proportion of those journeys were substituted for driving, and some smallish percentage of that is further offset by fewer journeys to and from the airport. I think that the extra driving probably did lead to a small number of additional deaths, but again this question of 'is one additional death enough for you to think I'm wrong' comes back.
If I throw aside all scientific ways of looking at it, my belief is that in terms of direct causation, probably, in the 12 months following 9/11, somewhere between 50 and 500 people died on the roads who would not have died on the roads if 9/11 had not happened. But a lot of those were travelling when the airspace was not closed.
If we look at the number of people who died because they made a trip by car that they would have otherwise made by plane but couldn't because US airspace was closed (i.e. on 9/11 itself and during the ground stop on the 12th), you're looking at what I believe to be a very, very small number of people, maybe even zero.