What Is a Disaster?

A disaster is a serious disruption of the functioning of a society or a community, involving widespread human, material, economic or environmental losses and impacts which exceed the ability of the affected society or community to cope using its own resources.

In modern academia, disasters are seen as the consequence of inappropriately managed risk. These risks are the product of a combination of hazards and vulnerability. Hazards that strike areas with low vulnerability will not become disasters, as is the case in uninhabited regions.

Developing countries bear the greatest costs when a disaster strikes – more than 95 percent of all losses caused by hazards occur in developing countries, and damage from natural hazards is 20 times greater in developing countries than in industrialized countries.

History

The word disaster derives from Middle French désastre, and that from Old Italian disastro, which in turn comes from the Greek pejorative prefix δυσ- (dus-) “bad” and ἀστήρ (astēr) “star”. The origin of the word disaster (“bad star” in Greek) lies in an astrological sense of a catastrophe blamed on the position of the planets.

Categories

Researchers have been studying disasters for more than a century, and disaster research has existed as a distinct field for more than forty years. These studies reflect a common view when they argue that all disasters can be seen as human-made, the reasoning being that human actions before the hazard strikes can prevent it from developing into a disaster. All disasters are hence the result of human failure to put appropriate disaster management measures in place. Hazards are routinely divided into natural and human-made, although complex disasters, where there is no single root cause, are more common in developing countries. A particular disaster may also trigger a secondary disaster that intensifies the impact. A classic example is an earthquake that causes a tsunami, resulting in coastal flooding.

Natural disaster

A natural disaster is a natural phenomenon that may cause loss of life, injury or other health impacts, property damage, loss of livelihoods and services, social and economic disruption, or environmental harm.

Phenomena such as landslides, earthquakes, hurricanes, volcanic eruptions, floods, tornadoes, tsunamis, cyclones and blizzards are all natural hazards that kill thousands of people and destroy billions of dollars’ worth of homes and property every year. However, the rapid growth of the world’s population and its increased concentration, often in hazardous environments, has escalated both the frequency and the severity of disasters. Tropical climates and unstable landforms, coupled with deforestation, unplanned growth, non-engineered construction that makes disaster-prone areas more vulnerable, delayed communication, and poor or no budgetary allocation for hazard prevention, mean that developing countries suffer more or less chronically from natural hazards. Asia tops the list of casualties caused by natural hazards.

Terrorist attacks and airplane crashes are examples of man-made disasters: they cause loss of life, pollution and damage to property.

Man-made disasters

Human-made disasters are the result of technological hazards. Examples include fires, transport accidents, stampedes, oil spills, nuclear explosions/radiation and industrial accidents. Deliberate attacks and war may also be placed in this category. As with natural hazards, man-made hazards are events that have not yet happened (the threat of terrorism, for instance); man-made disasters are the specific cases in which such hazards have become reality.

About Face due to data errors

Via our eagle-eyed correspondent Tash Whitaker comes this story from the UK health service:

Last month, the National Health Service took the unusual step of closing down a children’s heart surgery unit at a UK hospital, after data it had submitted showed that twice as many children and babies died in the unit as anywhere else in the UK. The UK media went into a frenzy; people came out of the woodwork with stories about their treatment at the hospital, with neglect and near-death experiences in abundance.

Eleven days later, the unit is set to reopen. It turns out there were not twice as many people dying after all, just a terminal case of data malaise. The data the hospital submitted to the NHS was late and incomplete; in fact, 35% of the expected data was missing entirely, with catastrophic results.
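The scale of the distortion is easy to see with a rough sketch. The figures below are entirely hypothetical, not the actual NHS numbers; the point is only that if deaths are fully reported while a share of the case records goes missing, the apparent mortality rate is inflated:

```python
# Entirely hypothetical figures, not the actual NHS data: deaths are
# fully reported, but 35% of the case records never arrive.

def mortality_rate(deaths, cases):
    """Deaths per 100 recorded cases."""
    return 100.0 * deaths / cases

true_cases = 1000                              # operations actually performed
deaths = 20                                    # deaths, all of them reported
reported_cases = int(true_cases * (1 - 0.35))  # only 650 records submitted

print(mortality_rate(deaths, true_cases))      # 2.0   (true rate)
print(mortality_rate(deaths, reported_cases))  # ~3.08 (apparent rate)
```

With a third of the denominator missing, the unit looks roughly one and a half times as deadly as it really is; a larger gap, or missing records concentrated among survivors, can easily double the apparent rate.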

This particular hospital had obviously not stopped to think about the impact that bad quality data has on their business and on their customers. How many children and babies had heart surgery postponed as a result of the closure? How many may later die as a result of that postponement?

In a twist of fate, the unit was closed down only 24 hours after a High Court ruling that the hospital should keep its heart unit long term. I suspect that decision is now in jeopardy. How can the hospital’s reputation recover from something like this? Would you want your child to be operated on somewhere with a reputation for high death rates? A reputation that we know to be wrong but will no doubt stay with this hospital unit for many years to come.

The importance of data as a business asset is proclaimed regularly, but we forget to mention that it can also be a liability. Most people don’t remember when good quality data helped them make decisions, helped them grow their business, or enabled them to beat the competition; but they sure as hell remember when bad data causes their business operations to cease, their reputation to be torn to ribbons and their status as a trusted entity to be shattered before their eyes.

(Sources: http://news.sky.com/story/1075720/leeds-hospitals-own-data-stopped-surgery

http://www.bbc.co.uk/news/health-22076206)

(Thanks to Tash for the alert and the excellent write up)

Did an Excel coding error destroy the economies of the Western world?

The title of this post is taken from an article by Paul Krugman (Nobel prize winning economist) in the New York Times of the 18th of April 2013. And it really is a good question that sums up the significance of the information quality problems that have emerged in an economic model which has been used to guide the actions of governments and non-governmental organisations in response to the global financial crisis.

Krugman’s article summarises the background very succinctly but we’ll summarise it again here:

  1. In 2010 two Harvard economists, Carmen Reinhart and Kenneth Rogoff, who between them had served with and advised a number of governmental and supra-governmental organisations, produced a paper arguing that there was a key threshold above which government debt became unsustainable and had a negative effect on economic growth. That threshold was 90%.
  2. That threshold was used as a key benchmark to inform policies for dealing with government debt crises in Europe and elsewhere. It became an article of faith (despite some economists questioning the causation/correlation relationship being argued). The official line taken with countries facing sovereign debt challenges was that austerity was required to bring debt below 90% to prevent a fall-off in growth – and there was academic research to prove it.
  3. However, other researchers struggled to replicate the results presented in the original paper – the decline in growth was never as severe, and the causal relationship never as definitive. Eventually one researcher got access to the original spreadsheet and uncovered methodological issues and fundamental calculation errors, including an averaging formula whose range left out data points (5 countries were omitted).

The reanalysis of the spreadsheet data, correcting for the methodology issues and the calculation errors, found no average negative growth above the 90% threshold. According to Mike Konczal on the economics blog NextNewDeal.net:

They find “the average real GDP growth rate for countries carrying a public debt-to-GDP ratio of over 90 percent is actually 2.2 percent, not -0.1 percent as [Reinhart-Rogoff claim].” [UPDATE: To clarify, they find 2.2 percent if they include all the years, weigh by number of years, and avoid the Excel error.] Going further into the data, they are unable to find a breakpoint where growth falls quickly and significantly.

Konczal goes on to hope that future historians will recognise that:

one of the core empirical points providing the intellectual foundation for the global move to austerity in the early 2010s was based on someone accidentally not updating a row formula in Excel.
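That kind of row-formula slip is easy to reproduce. The numbers below are invented purely for illustration (they are not the Reinhart-Rogoff data): an average whose cell range stops five rows short of the end of the table, the spreadsheet equivalent of an =AVERAGE() formula over a truncated range:

```python
# Hypothetical growth figures, not the Reinhart-Rogoff dataset: a
# spreadsheet average whose cell range stops five rows short of the end.

growth = [-2.0, 1.0, -1.5, 0.5, -0.5,   # rows inside the formula's range
           4.0, 3.5, 5.0, 2.5, 4.5]     # rows the formula silently skips

buggy_mean = sum(growth[:5]) / 5         # like =AVERAGE(B2:B6) on 10 rows
correct_mean = sum(growth) / len(growth)

print(buggy_mean)     # -0.5  (apparent negative growth)
print(correct_mean)   # 1.7   (actual positive growth)
```

The truncated range reports negative average growth; including every row flips the sign, which is qualitatively what the reanalysis found (2.2 percent rather than −0.1 percent).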

An alternative analysis of the data presented on NextNewDeal.net also raises questions about the causal relationship and dynamic that the original paper proposed (that high government debt causes decline in demand).
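That causality question can be made concrete with a small simulation. The parameters below are made up: the code generates a world in which low growth causes high debt, the reverse of the reading used to justify austerity, and shows that the observed correlation is strongly negative all the same. Correlation alone cannot tell the two stories apart.

```python
# Made-up simulation: LOW growth drives HIGH debt, yet the data shows
# the same negative debt-growth correlation the original paper relied on.
import random

random.seed(1)
pairs = []
for _ in range(200):
    growth = random.gauss(2.0, 1.5)                 # growth is the cause here
    debt = 80 - 10 * growth + random.gauss(0, 5)    # weak growth -> high debt
    pairs.append((debt, growth))

# Pearson correlation, computed by hand to stay dependency-free
n = len(pairs)
mean_d = sum(d for d, _ in pairs) / n
mean_g = sum(g for _, g in pairs) / n
cov = sum((d - mean_d) * (g - mean_g) for d, g in pairs)
sd_d = sum((d - mean_d) ** 2 for d, _ in pairs) ** 0.5
sd_g = sum((g - mean_g) ** 2 for _, g in pairs) ** 0.5
r = cov / (sd_d * sd_g)

print(round(r, 2))   # strongly negative, even though debt causes nothing here
```

A negative correlation between debt and growth is exactly what this reverse-causality world produces, so the correlation by itself cannot establish that debt above a threshold depresses growth.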

Paul Krugman has posted further updates on his NYTimes blog today.

Impact?

As with many information quality errors, the impacts are often not immediate. Among the potential impacts of this spreadsheet error, and of the causal dynamic it was used to argue for, are:

  • Austerity policies in Ireland, Greece, Cyprus, Italy, Portugal, Spain and other countries
  • Business failures (fiscal contraction in an economy reduces the supply of investment finance, weakens demand, lengthens payment cycles, etc.)
  • Reductions in public services such as health care, and increases in taxation
  • Increases in suicide in austerity countries (e.g. Greece)

Conclusion

Where data and its analysis become an article of faith for policy or strategy, it is imperative that attention be paid to the quality of both the data and the analysis. In this case, opening the data up for inspection sooner might have allowed a more timely identification of the potential issues.

It also highlights the importance of carefully assessing cause and effect when looking at the relationship between two factors. This is an important lesson for Information Quality professionals when it comes to identifying the root causes of quality problems in their organisations.