Disaster Mortality Scale

A log scale for disasters to allow scale comprehension
(+10, -1)

The human brain has difficulty comprehending and comparing numbers over large scales. Once numbers get too big many people treat them all as the same, sometimes with unfortunate consequences.

The power of earthquakes is traditionally reported using the Richter scale [1]. This is useful because it converts power over many orders of magnitude into small numbers easily comprehended by the human brain.

I propose a measurement scale for disasters which is simply the base 10 logarithm of (10 times the number of human deaths) [2][3]. This yields the following values:

deaths : DMR (disaster mortality rating)
1 : 1
10 : 2
100 : 3
1000 : 4

Since people generally seem to comprehend earthquake powers, this should allow more rational understanding of the scale of any disaster.

September 11 attacks 2001 : 4.5
UK road fatalities 2002 : 4.5
Bhopal disaster: 4.6
Japan Earthquake and Tsunami 2011 [4]: 5.4
USA road fatalities 2002 : 5.6
Boxing Day Tsunami 2004 : 6.4
Second Gulf war (excess deaths) : 6.8
Destruction of Earth with current population: 10.8
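
A minimal Python sketch of the calculation, for anyone who wants to check the figures above. The function names are illustrative, and the September 11 toll of roughly 2,977 is only used as a worked example; everything else follows the definition:

import math

def dmr(deaths):
    # Disaster Mortality Rating: base 10 log of (10 x number of deaths)
    return math.log10(10 * deaths)

def deaths_from_dmr(rating):
    # Invert the scale: number of deaths implied by a given rating
    return 10 ** rating / 10

print(dmr(1))                       # 1.0 - a single fatality
print(dmr(1000))                    # 4.0
print(round(dmr(2977), 1))          # 4.5 - September 11 attacks, approx. toll
print(round(deaths_from_dmr(6.4)))  # ~251189 - deaths implied by a 6.4 rating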

[1] Although for scientific measurement it has been superseded by a similar but more useful scale.
[2] The system could - or should - be extended to sub-lethal disasters by also taking morbidity into account. This would, however, have to be reported as a separate value. If a mortality measurement is DMR (Disaster Mortality Rating), then this value would be DMMR (Disaster Mortality and Morbidity Rating) - both could be reported, if available.
[3] I include the multiplication by 10 to get a non-zero value for a single fatality. Take the log and add one if you prefer.
[4] Includes 'missing'.

Loris, Apr 01 2011

Immortality Tax
If people could multiply, we might fund enough medical research [sninctown, Dec 07 2013]

       Brilliant
pocmloc, Apr 01 2011
  

       So I can paralyze 50,000 people without even getting a "1"? Excellent...
Voice, Apr 01 2011
  

       The Wreckeder scale.   

       Not enough spread. If 1 is the basic standard for human tragedy (say a bus crash) and 4.5 represents tragedies of national importance, we start to cluster really hard. That the famine deaths of a few million people could be so near a minor natural disaster and also so near the extinction of the species makes this scale even worse than the current US system of weights and measures. After all, in the wider perspective, our inability to grasp the scale of these things isn't going to be helped by putting them so close together any more than when we separated them with large numbers.   

       "Imagine that I ripped you open and took out your heart, now Imagine that I did that 1x10^6th times, do you understand how painful that would be?"
WcW, Apr 01 2011
  

       Good idea.   

       Afghanistan UN Compound zerg rush: 2.2
rcarty, Apr 01 2011
  

       Birth: -1
Twins: -1.3
Birth of a nation: -8
MaxwellBuchanan, Apr 01 2011
  

       Origination of universe: -10.2
WcW, Apr 02 2011
  

       I agree with WcW that there's not enough spread. Also, events that occur over a time period (e.g. annual car accident fatalities) should have their own scale and not be merged with this one.
phundug, Apr 02 2011
  

       I think it's very good, and I'm not worried about the 'spread'. We have no problem differentiating between yesterday's earthquake in Blackpool in the UK of magnitude 2.2 and something pretty bad like the 6.3 magnitude quake in Christchurch.

Nasty cold: -2

[Max-B] Saying "Birth: -1" doesn't make any sense. The reverse logarithm of -1 is 0.1, which we then divide by 10 to get 0.01, so it's suggesting that a birth is the equivalent of 1% of a death.
hippo, Apr 02 2011
  

       [Ian]'s definitely right about the time and the normalisation is nice for an immediate comparison - but it doesn't let you work 'back'.   

       Maybe some kind of 2 number system to describe short term and long term factors could be used? An 'impulse' scale for day 1 casualty/damage and a 6-month or 12 month 'tail' response number to describe the aftermath.   

       The impulse number could describe mortality directly due to the event and be a straightforward logarithmic scale. The tail value could be a letter and number to indicate the distribution and magnitude of mortality beyond the impulse. This is still very raw thinking and I am not sure what distributions could be short-handed.   

       An 'instant' disaster such as an explosion may have a high initial component but few directly related deaths in the following weeks. A disaster with a poisonous or toxic effect (chemical or radiation) will have a longer tail with exposure effects and some kind of half-life. For events that are not singular (car accident deaths), the impulse would be very small but the tail may be significant. I'm not sure how the Pacific Tsunami of 2004 would be described.   

       So, maybe something like:
9/11 Terrorist attacks : I=3.5, T=R2 (meaning around 3.2 thousand deaths immediately and a Residual in the order of 100s in the following weeks/months)
Bhopal : I=3.6, T=D3.0 (meaning around 3.9 thousand deaths immediately, followed by a Decaying death toll in the order of a thousand)
Annual Car deaths in the UK : I=0, T=B3.0 (meaning that a singular event is in the order of 1 to 10, and that a Boxcar, or uniform, death toll in the order of a thousand follows for the rest of the year)
  

       {The above values are log10, not 1+log10 as in the idea body - I can change them if that is the more accepted scale}
Jinbish, Apr 02 2011
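
A rough Python sketch of how [Jinbish]'s two-number notation might be generated, assuming the impulse is log10 of the immediate deaths and the tail is a shape letter (R, D or B) followed by log10 of the longer-term toll; the function name and figures are illustrative only:

import math

def impulse_tail(immediate_deaths, tail_shape, tail_deaths):
    # Format an event as "I=<log10 immediate>, T=<shape><log10 tail>"
    i = math.log10(immediate_deaths) if immediate_deaths > 0 else 0.0
    t = math.log10(tail_deaths) if tail_deaths > 0 else 0.0
    return "I=%.1f, T=%s%.1f" % (i, tail_shape, t)

# Roughly reproduces the examples in the annotation above
print(impulse_tail(3000, "R", 100))   # I=3.5, T=R2.0
print(impulse_tail(3900, "D", 1000))  # I=3.6, T=D3.0
print(impulse_tail(1, "B", 1000))     # I=0.0, T=B3.0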
  

       [Jinbish] So, if the consequences are a function f(t) of time, you're suggesting we use the area under the curve rather than the peak?   

       f(t) for traffic mortality would be a boxcar (as you say) but for one-off mass deaths it would be a Dirac delta. How would one choose f(t) in general?
mouseposture, Apr 02 2011
  

       //suggesting we use the area under the curve rather than the peak?//   

       Only for the longer term analysis - although I am not sure myself what the boundary between short and long term is. That's why I think two values might be useful.   

       //How would one choose f(t) in general?// Not sure. I made the examples up as I went along in an effort to get the basic thought out as an anno. I'm thinking of perhaps 3 or 4 basic f(t) models.
R: A plain residual value with undetermined function (but a total of X over time)
D: A decaying value over time (not sure if differentiating between linear and exponential is useful)
B: A uniform value per unit of time (annually).
Jinbish, Apr 02 2011
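
A minimal Python sketch of the three tail models as deaths-per-unit-time functions f(t), each arranged so that the area under the curve equals the stated tail total; the parameter choices are illustrative only:

import math

def boxcar(total, duration):
    # B: a uniform rate over the period; integrates to 'total'
    return lambda t: total / duration if 0 <= t <= duration else 0.0

def decaying(total, half_life):
    # D: exponential decay; the rate constant makes the integral equal 'total'
    lam = math.log(2) / half_life
    return lambda t: total * lam * math.exp(-lam * t) if t >= 0 else 0.0

def residual(total, duration):
    # R: shape undetermined; treated here as uniform for want of anything better
    return boxcar(total, duration)

# e.g. ~1000 annual road deaths as a boxcar over 365 days
f = boxcar(1000, 365)
print(round(f(10), 2))  # 2.74 deaths per day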
  

       //The reverse logarithm of -1 is 0.1, which we then divide by 10 to get 0.01//   

       Well, technically if the scale is logarithmic it can't cover negative values at all. However, the stated idea was to take the log of the number of deaths, then add 1 to it (or, which amounts to the same thing, multiply the number of deaths by 10 then take the log).   

       So, a value of 1 gives a result of 1 (log 1 = 0; +1 = 1... or, if you prefer, 1x10 = 10, log 10 = 1). Logically, therefore, one birth would be -1.
MaxwellBuchanan, Apr 02 2011
  

       I recall reading about the US State Department developing something very much like this during the cold war to predict how the survivors of a nuclear weapons exchange would react. I can't remember which of my 1000+ books I read this in or which moldy cardboard box above my garage contains said book, so unfortunately I cannot be more specific at this time (Google, predictably, was of no help whatsoever).   

       Does anyone know what I'm talking about? If not, please do what you usually do and completely disregard my anno.
Alterother, Apr 04 2011
  

       //So, a value of 1 gives a result of 1 (log 1=0; +1 = 1....or, if you prefer, 1x10=10, log 10 = 1). Logically, therefore, one birth would be -1.// - but you're just using the logic for calculating the value on this scale equivalent to a number of deaths and sticking a minus sign in front of it. That's not what a negative logarithm means on this scale. If someone suffers a nasty knock on the head, for example, equivalent to 1% of a death, then the number of deaths is 0.01, the logarithm of (0.01) is -2 and so the value on the scale is -1.
hippo, Apr 04 2011
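
A quick numeric check of that reading, taking the scale exactly as defined, so that a rating D corresponds to 10^(D-1) deaths:

for rating in (-1, 1, 4.5):
    print(rating, 10 ** (rating - 1))
# -1  -> 0.01 deaths (hippo's reading: 1% of a death, e.g. a nasty injury)
#  1  -> 1 death
# 4.5 -> ~3162 deaths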
  

       I'm not sure the log10 thing helps; it keeps the numbers low, but as others have pointed out, this packs pretty much everything short of global catastrophe in at <5.   

       I think the time element is important, and quite like [Jinbish]'s idea of presenting a curve of deaths/time. This too could be condensed into a single number showing the overall average either over the whole event, or its tail, or both - a deaths-per-minute type measure allows us to compare terrorism vs tsunami vs car accidents - which is probably close to what we do already with annual statistics. It's just that we tend to make a clear distinction between 'one-off' events that occur once or twice (or less) in a lifetime, and those that occur every day. In other words, if the frequency of occurrence is close to the generational 'refresh' time (probably 20+ years or so) then it's treated individually, rather than stochastically.
zen_tom, Apr 04 2011
  

       We wish to propose merging the various geophysical scales for damaging events (Mercalli, Fujita, Hurricane, VEI etc.) into a single "Awshit!" Scale for use by the mass media.   

       Few members of the general public understand the implications of an F3 tornado or a VEI4 eruption. A homogeneous "disaster scale" would be much more useful.
8th of 7, May 20 2011
  

       Employees would be rated in milli-oops per second.
mouseposture, May 20 2011
  

       I want to share my "death is bad; do more science!" rant, but I'll just say that I wish most people could comprehend large numbers and appreciate the value of science.
sninctown, Dec 07 2013
  

       And by counting the news choppers and news vans with a "live" newsperson, you could state whether the media was under- or overreacting to the event.   

       I like it.
popbottle, Jun 18 2014
  

       //I want to share my "death is bad; do more science!" rant//   

       Please do
Voice, Jun 18 2014
  

       [hippo] said "I'm not worried about the 'spread'"   

       Obviously not a sports gambler then.
normzone, Jun 18 2014
  

       //it's treated individually, rather than stochastically.//   

       This leads to the idea of a "Personal Impact Scale" in which the event is characterised by its degree of impact on the average viewer (cf. "Small war, very far away").   

       Thus "Earthquake in Chile, 100,000 dead, 500,000 homeless, anarchy supervenes" might rate a 1.3 to Europeans, whereas a "Flood in China, no more shipments of big screen LCD TVs until 2016" could top a 7 in the USA and other G7 members ...
8th of 7, Jun 20 2014
  


 
