Improving Sample Hit Rates With Analytics

Mar 21, 2016

As I prepare for an upcoming webinar on March 23rd and a live presentation in Williamsburg, Virginia in April 2016, I realize that auditors have an inherent need to select samples of a population. Or do they? Sampling is done because a professional cannot gather data from the entire population, but what if you, the auditor, could for a given business process?

For any computer enthusiast, I suggest watching the movie The Imitation Game, which dramatizes the effort to break Enigma, the German cipher machine whose settings changed every day:

There are 159 million, million, million possible Enigma settings. All we had to do was try each one. But if we had 10 people checking one setting a minute for 24 hours every day and seven days every week, how many days do you think it would take to check each of the settings? Well, it’s not days; it’s years. It’s 20 million years. To stop an incoming attack, we would have to check 20 million years’ worth of settings in 20 minutes.

Now, no one is expecting auditors to manually test every transaction. Understandably, it is much easier to pick a small sample and rely on statistics to extrapolate error rates. However, as in the movie (and the real history it is based on), shouldn’t we also realize that we, as auditors, need a computerized machine to perform the calculations? In our current age of “Big Data” and ever more powerful PCs, why would we still rely on an approach as antiquated as manual sampling? It is almost as if we use sampling as a simplistic means of testing a business process, even when a better solution is staring us right in the face.
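
Whatever the exact figure, the point is that the arithmetic itself is trivial for a machine. A few lines of Python (purely illustrative, using the numbers from the quote) show the scale of an exhaustive manual check:

```python
# Back-of-the-envelope: how long would checking every setting by hand take?
SETTINGS = 159 * 10**18        # "159 million, million, million" possible settings
CHECKS_PER_MINUTE = 10         # 10 people, each checking one setting per minute

minutes = SETTINGS / CHECKS_PER_MINUTE
years = minutes / (60 * 24 * 365)
print(f"Roughly {years:,.0f} years of manual checking")
```

A computer evaluates this instantly; the same logic applies to testing transactions.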

That solution is to audit as much of the business process data as possible with analytics, otherwise known as 100% auditing. From specific reports testing segregation of duties to data mining analysis that helps auditors visualize the financial accounts, it is not a far stretch to imagine that at least half of the current procedures where sampling is applied could be turned into an analytic. Note that many times a business process may not have computer-readable data, but isn’t that an audit finding in itself? Shouldn’t as much of the business process as possible be reduced to a digitized, searchable dataset rather than relying on paper copies of revenue and expense transactions?
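
As a minimal sketch of what such an analytic can look like, assume a hypothetical journal-entry extract with created_by and approved_by columns (the file and column names here are illustrative, not from any particular system). A segregation-of-duties test then runs over the full population, not a sample:

```python
import pandas as pd

# Hypothetical full-population extract; file and column names are illustrative.
txns = pd.read_csv("journal_entries.csv")   # entry_id, created_by, approved_by, amount

# Segregation-of-duties test over 100% of entries:
# flag every entry that the same person both created and approved.
exceptions = txns[txns["created_by"] == txns["approved_by"]]

print(f"{len(exceptions)} of {len(txns)} entries failed the SoD test")
exceptions.to_csv("sod_exceptions.csv", index=False)
```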

We have to expect that the first year of 100% auditing may be a bit rough, as we would be identifying a variety of false positives as well as legitimate fixes to our business processes. For those interested, I explain how to remove false positives in a separate article (http://bit.ly/1Pdz3Df). Aside from the findings identified for review, we would first need to establish the data methods to extract, normalize, and then filter down to only those transactions associated with our control and financial balance testing. This is a year-one data and reporting investment, but it pays dividends in all future years.
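
A rough sketch of that year-one extract, normalize, and filter pipeline, again with hypothetical file and field names, might look like this:

```python
import pandas as pd

def extract(path):
    """Pull the raw system export (any ERP or general ledger extract would do)."""
    return pd.read_csv(path)

def normalize(df):
    """Standardize types and formats so recurring runs stay comparable."""
    df = df.copy()
    df["posting_date"] = pd.to_datetime(df["posting_date"], errors="coerce")
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    df["account"] = df["account"].astype(str).str.strip()
    return df.dropna(subset=["posting_date", "amount"])

def filter_in_scope(df, accounts):
    """Keep only transactions tied to the controls and balances under test."""
    return df[df["account"].isin(accounts)]

# Illustrative: only the accounts in scope for this year's testing.
population = filter_in_scope(normalize(extract("gl_export.csv")),
                             accounts={"4000", "5000"})
```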

Analytic-enabled testing can be completed in seconds and can be scheduled to run on a recurring basis. This is much faster than any sampling approach and, as in The Imitation Game, requires hardly any human resources to perform the tests. Instead, the humans can focus on the exceptions such reporting surfaces, helping to cure the business process through root cause analysis. Wouldn’t it be better for us to focus more on the solution than on testing whether we have a problem? Can’t we leave those menial tests to the computer?
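
To make the recurring piece concrete, here is one simple way to schedule such a test (a sketch using the third-party schedule package; a cron job or an audit platform’s built-in scheduler works just as well):

```python
import time
import schedule  # pip install schedule

def run_sod_test():
    # Re-run the full-population segregation-of-duties test sketched above
    # and hand any exceptions to the team doing root cause analysis.
    print("Running 100% SoD test over the latest extract...")

# Illustrative cadence: re-test the whole population every Monday morning.
schedule.every().monday.at("06:00").do(run_sod_test)

while True:
    schedule.run_pending()
    time.sleep(60)
```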

For more information on the two-CPE webinar entitled “Improving Fraud and Audit Sampling Productivity,” where we will explore sampling and improved ways to test 100% of populations, please click here (http://bit.ly/1La5d2R). I also look forward to seeing everyone at the April 2016 Williamsburg, Virginia Fraud Conference (http://bit.ly/1XB78id).