The title of this post is taken from an article by Paul Krugman (Nobel prize winning economist) in the New York Times of the 18th of April 2013. And it really is a good question that sums up the significance of the information quality problems that have emerged in an economic model which has been used to guide the actions of governments and non-governmental organisations in response to the global financial crisis.
Krugman’s article sets out the background very succinctly, but we’ll recap it here:
- In 2010 two Harvard economists, who between them had served with and advised a number of governmental and supra-governmental organisations, produced a paper that argued that there was a key threshold above which Government debt became unsustainable and had a negative effect on economic growth. That threshold was 90%.
- That threshold was used as a key benchmark to inform policies for dealing with government debt crises in Europe and elsewhere. It became an article of faith (despite some economists questioning the causation/correlation relationship being argued). The official line being taken with countries with sovereign debt challenges was that austerity was required to reduce debt below 90% to prevent a fall off in growth – and there was academic research to prove it.
- However other researchers struggled to replicate the results presented in the original paper – decline in growth was never as severe and the causal relationship was never as definitive. Eventually one researcher got access to the original spreadsheet and uncovered methodological issues and fundamental calculation errors, including a formula calculating an average that left out data points (5 countries were omitted).
The reanalysis of the spreadsheet data, correcting for the methodology issues and calculation errors, found no average negative growth above the 90% threshold. According to author Mike Konczal on the economics blog NextNewDeal.net:
They find "the average real GDP growth rate for countries carrying a public debt-to-GDP ratio of over 90 percent is actually 2.2 percent, not -0.1 percent as [Reinhart-Rogoff claim]." [UPDATE: To clarify, they find 2.2 percent if they include all the years, weigh by number of years, and avoid the Excel error.] Going further into the data, they are unable to find a breakpoint where growth falls quickly and significantly.
Konczal goes on to hope that future historians will recognise that:
one of the core empirical points providing the intellectual foundation for the global move to austerity in the early 2010s was based on someone accidentally not updating a row formula in Excel.
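The mechanics of that mistake are mundane: an averaging formula whose range stopped short of the last rows. A minimal sketch in Python of how such an error plays out (the country labels and growth figures below are invented for illustration, not the actual Reinhart-Rogoff data):

```python
# Illustrative real GDP growth figures for countries with debt-to-GDP
# over 90%. These numbers are made up to show the mechanics of the
# error, not the real dataset.
growth = {
    "Country A": -1.0,
    "Country B": 0.5,
    "Country C": 2.0,
    "Country D": 3.1,
    "Country E": 2.6,
}

rows = list(growth.values())

# The equivalent of writing =AVERAGE(B2:B4) when the data runs to B6:
# the range silently excludes the last rows, and Excel raises no error.
broken_average = sum(rows[:3]) / len(rows[:3])

# The corrected formula includes every data point.
correct_average = sum(rows) / len(rows)

print(broken_average)   # 0.5  -- looks like anaemic growth
print(correct_average)  # 1.44 -- a noticeably different story
```

The spreadsheet happily computes both versions; nothing flags that a range boundary excludes valid data, which is why this class of error survives review so easily.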
An alternative analysis of the data presented on NextNewDeal.net also raises questions about the causal relationship and dynamic that the original paper proposed (that high government debt causes decline in demand).
Paul Krugman has posted further updates on his NYTimes blog today.
As with many information quality errors, the impacts of this error were not immediate. Among the potential impacts of this spreadsheet error and of the assumed causal dynamic are:
- Austerity policies in Ireland, Greece, Cyprus, Italy, Portugal, Spain and other countries
- Business failures (due to fiscal contractions in an economy reducing supply of investment finance, weaker demand, longer payment cycles etc)
- Reduction in public services such as health care, and increases in taxation
- Increases in suicide in austerity-hit countries (e.g. Greece)
Where data and its analysis become an article of faith for policy or strategy, it is imperative that attention be paid to the quality of both the data and the analysis. In this case, opening up the data for inspection sooner might have allowed more timely identification of potential issues.
It also highlights the importance of careful assessment of cause and effect when looking at the relationship between two factors. This is an important lesson that Information Quality professionals can learn when it comes to figuring out the root cause of quality problems in the organisation.
Via our eagle eyed correspondent Tash Whitaker comes this story from the UK health service:
Last month, the National Health Service took the unusual step of closing down a children’s heart surgery unit at a UK hospital, after data the hospital had submitted showed that twice as many children and babies died in the unit as anywhere else in the UK. The UK media went into a frenzy; people came out of the woodwork with stories about their treatment at the hospital, with neglect and near-death experiences in abundance.
Eleven days later and the unit is set to reopen. Turns out that there were not twice as many people dying after all, just a terminal case of data malaise. The data that the hospital submitted to the NHS was late and incomplete; in fact, 35% of the expected data was missing completely, with catastrophic results.
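The arithmetic of why incomplete submissions can distort a mortality comparison is worth sketching: if deaths are fully recorded but a share of the surgical caseload never arrives, the computed death rate inflates. A hypothetical illustration in Python (the figures are invented, not the hospital’s actual data):

```python
deaths = 10                 # deaths, assumed fully reported
actual_cases = 500          # true number of operations performed
submitted_cases = int(actual_cases * (1 - 0.35))  # 35% of records missing

true_rate = deaths / actual_cases          # 2.0% mortality
apparent_rate = deaths / submitted_cases   # ~3.1% mortality

# The unit looks roughly 1.5x worse than it really is, purely
# because the denominator of the ratio is incomplete.
print(f"{true_rate:.1%} vs {apparent_rate:.1%}")
```

Whether missing records inflate or deflate a metric depends on which side of the ratio they fall; the point is that an incomplete denominator silently changes the comparison against other hospitals.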
This particular hospital had obviously not stopped to think about the impact that bad quality data has on their business and on their customers. How many children and babies had heart surgery postponed as a result of the closure? How many may later die as a result of that postponement?
In a twist of fate, the unit was closed down only 24 hours after a High Court ruling that the hospital should keep its heart unit long term. I suspect that decision is now in jeopardy. How can the hospital’s reputation recover from something like this? Would you want your child to be operated on somewhere with a reputation for high death rates? A reputation that we know to be wrong but will no doubt stay with this hospital unit for many years to come.
The importance of data as a business asset is proclaimed regularly but we forget to mention that it can also be a liability. Most people don’t remember when good quality data helped them make decisions, helped them grow their business, or enabled them to beat the competition; but they sure as hell remember when it causes their business operations to cease, their reputation to be torn to ribbons and their status as a trusted entity to be shattered before their eyes.
(Thanks to Tash for the alert and the excellent write up)
It’s been a little while since our last post of an IQTrainwreck. That doesn’t mean that they don’t still occur. Only this past weekend Irish national broadcaster RTE published inaccurate information about the winning numbers in the Irish National Lottery draw at the end of the broadcast. We’d like to show you video footage of the error but, to avoid compounding the error, RTE have edited the last few seconds from the end of the recording which is available on the RTE website.
According to The Irish Times RTE blame a software error for the incorrect display of numbers, which the broadcaster was forced to correct through continuity announcements during the remainder of the evening. Apparently a software update was applied ahead of the draw on Saturday 17th December.
The Irish National Lottery has expressed concern that anything might affect the collection of the winning prize, a trivial amount of only €4.9 million, but points out that there is more than one way for a person to check their lottery numbers.
There are a few lessons here for Information Quality professionals:
- When you are presenting mission critical information in time-sensitive environments, it is imperative that you have any changes to process, software, or technical architecture well tested before ‘show time’.
- When you are relying on the quality of information for critical decisions it is often worthwhile to take reference data points from other sources to validate and verify the source you are using, no matter how trusted or trustworthy it may have been in the past. Trust but Verify is a good mantra.
- When using data for decision making where accuracy is a “Critical to Quality” factor you should seek out the most authoritative source. Often this means going to the real-world object or the source data creator (in this case the National Lottery itself) rather than relying on a normally reliable surrogate source (the national broadcaster in this case), in case errors or defects have crept into the data being presented by the surrogate.
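The Trust but Verify point can be made concrete with a simple cross-source check: before acting on a value from a surrogate source, compare it against an independent and ideally authoritative source, and refuse to proceed on a mismatch. A hypothetical sketch (the function and the numbers are illustrative, not any real system):

```python
def verify_draw(surrogate_numbers, authoritative_numbers):
    """Return the draw numbers only if both sources agree; otherwise raise."""
    if sorted(surrogate_numbers) != sorted(authoritative_numbers):
        raise ValueError(
            "Sources disagree -- do not publish or act on these numbers"
        )
    return sorted(authoritative_numbers)

# Example: the broadcaster's on-screen numbers vs. the lottery's own results.
broadcast = [3, 7, 12, 24, 31, 40]
official = [3, 7, 12, 24, 31, 41]

try:
    verify_draw(broadcast, official)
except ValueError as e:
    print(e)  # the mismatch is caught before anyone acts on it
```

The cost of the extra lookup is trivial compared with the cost of acting on (or broadcasting) a wrong value.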
From Europe we learn of two stories with similar characteristics that tick all the boxes for classic Information Quality Trainwrecks.
From Germany we hear that due to errors in internal accounting in the recently nationalised Hypo Real Estate, the German National debt was overstated by €55 Billion (US$76 bn approx). This was doubly embarrassing for Germany as they had spent the last while criticising the accuracy of accounting by the Greek Government.
According to the Financial Post website:
In an era of austerity where their government has squabbled tirelessly for two years over a mooted €6-billion tax cut, Germans found it hard to fathom that their government was so suddenly and unexpectedly 55-billion euros better off.
The net effect of the error being found and fixed is that Germany’s Debt to GDP ratio will be 2.6% lower than previously thought.
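The arithmetic behind a figure like that is straightforward: the correction removes €55bn from the numerator of the debt-to-GDP ratio. A quick sketch, using an illustrative GDP of roughly €2.1 trillion (the reports don’t state the exact denominator, so this figure is an assumption chosen to reproduce the reported effect):

```python
gdp = 2115          # illustrative German GDP, in € billions (assumed)
overstatement = 55  # € billions mistakenly added to the national debt

# Removing the overstated debt lowers the ratio by overstatement / GDP.
ratio_change = overstatement / gdp * 100
print(round(ratio_change, 1))  # about 2.6 percentage points
```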
The root cause appears to be a failure to standardise accounting practices between two banks that were being merged as part of a restructuring of the German banking system. This resulted in the missing billions being accounted for incorrectly on the balance sheet of the German government, which owns the banks in question.
From Ireland we have a similar story of missing Billions. In this case a very simple accounting error resulted in monies that were loaned from one State agency (the National Treasury Management Agency) to another State Agency (the Housing Finance Agency) being accounted for by the Department of Finance in a way which resulted in €3.6billion being added to the Irish National Debt figures.
Almost coincidentally, this resulted in a 2% misstatement of the Irish National Debt. Also coincidentally, it is exactly the figure by which the Irish Government is seeking to reduce net expenditure in its forthcoming budget.
The problem was first spotted by the NTMA in August of last year (2010) but, despite a number of emails and phone calls from the NTMA to the Department of Finance, the error was not fixed until October 2011. For some reason there was a failure in the Department to recognise the error, understand its significance, or take action on it.
The Secretary General of the Department of Finance blames middle-management:
Secretary general of the department Kevin Cardiff said the error was made at “middle management” level and was never communicated up to a more senior level. He said the department was initiating an internal inquiry to examine the issue and would establish an external review to look at the systems and to put safeguards in place to ensure such mistakes were not repeated in the future.
Information Quality professionals would of course consider looking at the SYSTEM, and part of that system is the organisational culture in the Department which prevented a significant error in information from being acted upon.
Lessons to Learn:
There are a lot of lessons to learn from these stories. Among them:
- When bringing data together from different organisations, particularly when those organisations are being merged, it is important to ensure you review and standardise the “Information Product Specification” so that everyone knows what the standard terms, business rules, and meaning of data are in the NEW organisation and ACROSS organisational boundaries. Something as simple as knowing who has to put a value in the DEBIT column and where the corresponding CREDIT needs to be put should be clearly defined. Operational Definitions of critical concepts are essential.
- When errors are found, there needs to be clear and open channels of communication that allow the errors to be logged, assessed, and acted on where they have a material or significant effect. Organisational cultures where internal politics or historic arrogance lead managers to assume that the issue isn’t there or isn’t their problem ultimately result in the issue becoming bigger and more difficult to deal with.
- Don’t shoot the messenger. Don’t blame the knowledge worker. But ensure that there are mechanisms by which people can take accountability and responsibility. And that starts at the top.
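The first lesson above (agreeing where the DEBIT and CREDIT entries go) lends itself to a mechanical safeguard: across the combined books, matching intercompany entries should net to zero, and any residual is a red flag. A hypothetical reconciliation sketch (the account names and the €3.6bn figures are used purely for illustration):

```python
# Each entry: (organisation, account, amount in € billions).
# Debits are positive, credits negative. Figures are illustrative.
entries = [
    ("NTMA", "loan_to_HFA", 3.6),     # the loan as an asset of the lender
    ("HFA", "loan_from_NTMA", -3.6),  # the matching liability
    ("DoF", "national_debt", 3.6),    # an erroneous double-count
]

def reconcile(entries):
    """Intercompany positions should net to zero; return any residual."""
    return sum(amount for _, _, amount in entries)

residual = reconcile(entries)
if abs(residual) > 1e-9:
    print(f"Reconciliation failure: {residual} bn unaccounted for")
```

A check like this doesn’t tell you *which* entry is wrong, but it raises the alarm the moment the books stop balancing, rather than fourteen months later.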
[UPDATE - 17 August 2012: It has been drawn to our attention that the Macroom.ie website has been redesigned since this post was written. None of the links referred to below exist on the new site. This post addresses an issue that was identified as existing on one day in 2011, but when we discussed it with contacts in the hotel industry we learned of similar issues where tourists arrived at a hotel believing they had a booking when they had in fact booked a different hotel with the same or a similar name and URL in a different place.
The new macroom.ie website is well worth a visit and has lots of interesting information about Macroom and its surrounding localities, including my personal favourite, the Prince August factory (which I used to order moulds and metal from to make toy soldiers as a child).]
Via Twitter we came across this tale of Information Quality fun and games from the South West of Ireland.
Macroom is a popular tourist destination in Co. Cork. The local Town Council have invested in a portal website for the town, Macroom.ie. One of the boasts of Macroom is that it is just 45 minutes away by car from the tourist hotspot that is Killarney, with its National Park and other attractions. (Macroom itself is home to Ireland’s only toy soldier factory.)
On Macroom.ie you can link to various hotels in the locale to book accommodation. There is just one small problem.
The Riverside Park Hotel that is linked to from this site isn’t in Macroom. It is in Wexford. Over 3 hours away by car.
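A crude automated check could have caught this before any tourist did: validate that each linked business’s address actually mentions the town the portal serves. A hypothetical sketch (the hotel list and addresses are illustrative):

```python
# Hypothetical listing of hotels linked from a town portal site.
links = {
    "Castle Hotel": "Main Street, Macroom, Co. Cork",
    "Riverside Park Hotel": "The Promenade, Enniscorthy, Co. Wexford",
}

def check_links(links, expected_town):
    """Return the names of linked businesses whose address
    does not mention the expected town."""
    return [
        name
        for name, address in links.items()
        if expected_town.lower() not in address.lower()
    ]

print(check_links(links, "Macroom"))  # ['Riverside Park Hotel']
```

A simple rule like this won’t catch every mislinked business, but it would flag the obvious case of a hotel three hours’ drive away.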
From Twitter we learn that CNN is reporting that NATO is combing Tripoli looking for Colonel Gadaffi (or any of the other variant spellings of that name).
Unfortunately, if CNN is to be believed, NATO has just invaded another country to find the errant Colonel (Libya is a little further to the left, people…).
Of course, this is not the first time that CNN or other media outlets have made errors with geography. Here’s the CNN map showing the location of the Queensland flooding in Australia earlier this year:
And of course, Gawker.com took great pleasure in reporting on how Google presented the Russian invasion of the former Soviet Republic of Georgia as an attack on some good ol’ boys in the Deep South of the United States of America (at least according to its maps).
A great resource for information on mapping and cartographical errors is The Map Room Blog
Via The Miami Herald comes a story that highlights a number of impacts of poor quality information in key processes.
In Oklahoma, schools are placed on an “improvement list” if they fail to meet standards for two consecutive years. Once on the list, a school must show progress in improving standards for two years before it can be taken off the list. This can have implications for funding and access to resources as well. Some Oklahoma school districts are, it is reported, concerned that they won’t make the grade against Federal requirements.
Problems with the quality of demographic data in electronic testing performed by Pearson has affected the publication of the reports against which schools are graded. These will now be available a full month late, being released in September and not August as expected. This will affect the ability of School Boards to effectively respond to their report card.
Other problems reported, on top of the missed deadlines, include errors in printing the report cards to be sent to parents.
Oklahoma’s Superintendent of Schools Janet Barresi has described the impacts of poor quality data in this process as a “ripple effect” that is “imposing an unacceptable burden on school districts” and has called for Pearson’s contract to be reviewed. Pearson are engaging an independent 3rd party to help verify the accuracy and validity of the scoring data (which they are confident in).
Oklahoma is not the first State where data issues have been a problem.
- In 2010 in Florida Pearson was penalised $14.7 million, and had to ramp up staffing levels and make changes to systems as a result of problems with information quality leading to delays. The problems here related to matching of student records.
- In 2010 in Wyoming, Pearson also had to pay penalties arising from problems with the testing, ranging from data going missing to other administrative problems such as improperly calibrated protractors.
This video from the Data Quality Campaign, a US Non-Profit working to improve standards of data quality in the US Education system, highlights the value of good quality and timely information in this important sector:
It was widely reported yesterday that students in Scotland who had signed up for SMS notification of their results had received them a day early, giving them a jump on their less technically minded compatriots and competitors and causing stress and distress to students, parents, educators, civil servants, and politicians.
The Scottish Education Authorities began a root cause investigation as the secrecy and security of the examination results system seemed to have been compromised.
Right now I suspect you are settling in for a tale of hackers and Jason Bourne-style derring-do. Well, here at IQTrainwrecks we never get that lucky. After all, this is a blog that looks at information and data quality problems.
According to The Register the root cause of this problem is good old data interchange and exchange across organisations.
- A template spreadsheet was used to perform the data interchange between the Scottish Education Authority and AQL, the company that provides the SMS gateway and related processing to the Education Authority on a pro-bono basis. A batch template is used rather than an online interface because the service is only used once a year.
- The template was populated and saved in a later version of Microsoft Excel
- The process of populating and saving the spreadsheet appended a whitespace character to the end of each date stamp (the date on which the SMS was to be sent).
- The ETL process interpreted the “DATE” field as text (which it was thanks to that errant space) and rejected the field on the load. Luckily AQL had developed error handling for situations where a date field couldn’t be loaded and applied a default… the day of the file load (which was the day before the messages were to go out).
- As a result the SMS system read the file and sent the messages a full day early.
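The failure mode described above is easy to reproduce: a trailing space makes a strict date parse fail, and a “helpful” default then substitutes the wrong day. A minimal sketch of the chain of events (the field format and the default-to-load-day rule are assumptions based on the account above, not AQL’s actual code):

```python
from datetime import date, datetime

def load_send_date(raw_value, load_day):
    """Parse the scheduled send date; fall back to the load day on failure."""
    try:
        # Strict parse: a trailing space leaves unconverted data,
        # so strptime raises ValueError.
        return datetime.strptime(raw_value, "%Y-%m-%d").date()
    except ValueError:
        # Error handling that silently substitutes a default turns
        # a rejected field into an early send.
        return load_day

# The spreadsheet save appended a space to every date stamp.
scheduled = load_send_date("2013-08-06 ", load_day=date(2013, 8, 5))
print(scheduled)  # 2013-08-05 -- the messages go out a day early
```

Two small changes break the chain: strip or normalise incoming fields before parsing, and reject (rather than default) any record whose critical fields fail validation.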
The Irish National Lottery had an embarrassment last week when their Bank Holiday promotion draw went awry.
As part of a special draw for the August Bank Holiday weekend, the Lottery were offering a prize of a Jaguar XK convertible as an additional prize to the person who won the jackpot.
Unfortunately, due to apparent “human error” the National Lottery Company informed anyone who checked their numbers on-line and had matched any combination of numbers that they had won the car, even if the monetary value of the prize was as little as €5.00. They hadn’t, but the story still made headline news. Some outlets report that disgruntled non-winners are considering legal action.
It is important to have validation checks in place on reports and publication of data, particularly where that data would be of value or could be relied upon to the detriment of another person.
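One form such a validation check could take: the prize description shown to a player must be consistent with the prize tier they actually matched, and an inconsistent message should be blocked before publication. A hypothetical sketch (the tier names and message wording are invented for illustration):

```python
def prize_message(matched_tier, prize_value, jackpot_tier="jackpot"):
    """Build the player's prize message; only the jackpot wins the car."""
    message = f"You have won €{prize_value:.2f}"
    if matched_tier == jackpot_tier:
        message += " and a Jaguar XK convertible"
    # Validation check before publishing: a car should never appear
    # on a message for a non-jackpot tier.
    if "Jaguar" in message and matched_tier != jackpot_tier:
        raise AssertionError("Inconsistent prize message - do not publish")
    return message

print(prize_message("three_numbers", 5.00))  # You have won €5.00
```

The check is redundant with correct message-building logic, and that is the point: an independent rule catches the case where “human error” upstream has wired the car announcement to the wrong condition.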
We spotted this on Gawker.com. From my experience using Google Maps, it rings true (I was recently sent 15 miles out of my way on a trip in rural Ireland).
It seems that Google Maps has plotted the location of a tourist attraction in New Jersey right at the end of a driveway to a private residence. So, on the 4th of July weekend, the owners of the property had to fend off increasingly irate visitors who were looking for the lake and wound up in a private driveway.
So, the data is inaccurate and of poor quality. Has Google responded to the error and replotted the location of the tourist area at the lake? Not yet, according to the story on Gawker.