DATA INTEGRITY: WHY SHOULD YOU BE AS MAD AS HELL?

In the 1976 motion picture ‘Network’, Peter Finch plays broadcaster Howard Beale who makes an impassioned speech live on air. It starts: “I don't have to tell you things are bad.” He proceeds to rant loudly and animatedly about all the problems in society and then implores the audience to: “get up right now and go to the window. Open it, and stick your head out, and yell, 'I'M AS MAD AS HELL, AND I'M NOT GOING TO TAKE THIS ANYMORE!'”

Whilst the film did not end well for Howard Beale, the scene encapsulates what can happen when we accept too much that is simply not right and feel powerless to do anything about it.

Accountants assemble information drawn from the world, translate it into codes and categories, extract this information, compare the details with other details, draw inferences, produce reports and present them to management. In turn, management makes decisions based on the information presented. Naturally, the expectation is that “the information” (and the data that underpins it) is timely and correct, i.e. management information has an aura of data integrity around it.

Unfortunately, there are well-known examples where data integrity was lost somewhere between collection and decision. Think of the weapons of mass destruction that Iraq did not actually have, or companies like Enron that reported revenues they never really had. Taking steps to guard against providing misleading information is a constant and difficult challenge for anyone, let alone the CFO.

Finance has a wide range of techniques and tools to assist with the control of data integrity. That does not mean they are all applied, or applied well. Problems can include:

 

-  Lack of formalised control of master data

-  Accounting manuals are either non-existent, out of date, inaccessible, poorly written, or not adhered to

-  Coding guides suffer similar issues to accounting manuals

-  Closed periods force adjustments into the current open period – not in itself a bad thing – but where were these adjustments previously?

-  Once a process is automated, does anyone remember how it does things and why it sometimes behaves oddly?

-  Authorisations – who checks the checkers?

-  Up-front validation can cause a logjam of transactions outside the system that is not necessarily captured by accruals

-  Scheduling and checklists do not always reflect upstream or downstream changes

-  Systems are rarely well integrated

-  Account reconciliations can take a long time to complete and may not be completed by suitably experienced staff  

-  Comparatives to prior periods become less readily meaningful in times of significant organisational change and M&A activity

-  Variance analysis is heavily dependent on the quality of the budget and the preparedness of the business to provide explanations against a yardstick that is out of date the day it is published

-  Audit and review is not infallible. 
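Several of the items above boil down to the same basic discipline: comparing one set of records against another and chasing down the differences. As a purely illustrative sketch (the function, account numbers and figures below are invented for the example, not drawn from any particular system), an automated reconciliation check between general-ledger control accounts and their subledger totals might look like this:

```python
# Illustrative sketch: flag GL control accounts whose balance
# disagrees with the corresponding subledger total.
from decimal import Decimal

def reconcile(gl_balances, subledger_totals, tolerance=Decimal("0.00")):
    """Return (account, gl, subledger, difference) tuples for every
    account where the two figures differ by more than the tolerance."""
    exceptions = []
    # Consider every account that appears on either side.
    for account in sorted(set(gl_balances) | set(subledger_totals)):
        gl = gl_balances.get(account, Decimal("0"))
        sub = subledger_totals.get(account, Decimal("0"))
        diff = gl - sub
        if abs(diff) > tolerance:
            exceptions.append((account, gl, sub, diff))
    return exceptions

# Invented example: debtors (1200) agrees; creditors (2100) does not.
gl = {"1200": Decimal("10500.00"), "2100": Decimal("-8300.00")}
sub = {"1200": Decimal("10500.00"), "2100": Decimal("-8250.00")}
for account, g, s, d in reconcile(gl, sub):
    print(f"Account {account}: GL {g}, subledger {s}, difference {d}")
```

The point of even a toy check like this is that it runs the same way every period and leaves an exception list that someone must clear, which addresses at least two of the problems above: reconciliations that drag on, and checkers whom nobody checks.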

 

So, whilst there is always the possibility that data lacks integrity (even by relatively small degrees), continuous improvement of the control environment must be relentlessly and assertively pursued. Otherwise, what’s the point?

Don’t put up with it. Do something about it.

 

Related Posts:

PROCESS AND SYSTEMS DESIGN: HOW DO YOU GET TO DAMASCUS?

STANDARDISATION – IS IT BETTER TO LET PEOPLE RUN THEIR OWN RACE?

POLICY – DO YOU CREATE ROBOTS?

INNOVATION - HAVE YOU CONSIDERED THE COST OF STANDING STILL?

HOW DO YOU CLIMB A MOUNTAIN THAT LOOKS TOO TALL TO CONQUER?

 

  • 2016-07-21 13:17:30
  • Mark Spicer
  • Data integrity, Internal control environment, Continuous improvement, Marshaling of resources