
Instant CARMA

By Rich Lanza, CPA, CFE, CGMA

The term "Instant Karma" refers to immediate accountability for one's actions.  Doesn't that phrase exemplify our objective of continuous monitoring, which starts with risk management using a series of risk planning analytics, followed by designing automated alarms and on-site assistance, now focused only on the top areas of concern?

Part One of the system is the automation of risk planning, although this egg may need to come before the chicken at times.  Let me explain: until specific business processes and risks are managed with analytics so that deviations are detected, it is difficult to know:

  • Which alarms to build to manage the quality of that process,
  • When on-site testing is required, or, more efficiently,
  • When a GRC alert (with a secondary email) would be sufficient to distribute results to the process owner for comment.

Regardless of whether they ever run a specific business process report, risk managers can still apply analytic risk management; the trick is to think more “general”-ly with the ledger data at hand.  For example:

  • Build a trial balance by month for trending account activity over time
  • Visualize associated change between financial account types (revenue,
    expense, etc.)
  • Identify material unique and recurring entries to understand top patterns and volume trends
  • Locate new accounts never used to date and their materiality in the
    current period
  • Summarize trends by solely focusing on the text usage in the
    description fields

What aids general ledger reviews is that the data is frequently maintained at the detail transaction level, so once a trend is identified, the summary visualization can be drilled down into the underlying transactions instantly.
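
A minimal sketch of the first and fourth ideas above, assuming the journal-entry detail has been extracted to a file with posting_date, account, and amount columns (hypothetical names, not any specific ledger system): build a monthly trial balance with pandas and flag accounts that first appear in the current period.

```python
import pandas as pd

# Journal-entry detail; file and column names are assumed for illustration.
je = pd.read_csv("journal_entries.csv", parse_dates=["posting_date"])

# Monthly trial balance: net activity per account per month.
je["period"] = je["posting_date"].dt.to_period("M")
trial_balance = je.pivot_table(index="account", columns="period",
                               values="amount", aggfunc="sum", fill_value=0)
print(trial_balance.head())

# Accounts never used before the latest period, ranked by current materiality.
current = trial_balance.columns.max()
prior_activity = trial_balance.drop(columns=current).abs().sum(axis=1)
new_accounts = trial_balance[(prior_activity == 0) & (trial_balance[current] != 0)]
print(new_accounts[current].sort_values(key=abs, ascending=False))
```

Because the pivot is built from transaction-level detail, any summarized cell can be drilled back down by filtering the original extract on that account and period.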

Part Two of the continuous analytic engine is the set of purpose-built alarms, which serve to gain feedback on root causes while also providing detailed design specifications for the next alarm's development.  The system should get faster and smarter each time it runs, so the quicker the alarm-response process is automated, the faster the analytics transform around the process.  Manually developed emails and on-site visits can give way to automated updates in a results manager system, personalized for each user, with email reminders when responses are lacking.  The faster the responses are generated and trended themselves, the faster change can happen within the process, along with the design of the next best alarm for that process.
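
As a rough illustration of the reminder mechanism described above, and not of any particular GRC or results-manager product, the sketch below scans an exceptions log for alarms with no owner response past their due date and sends reminder emails; the exceptions.csv layout, column names, and mail relay address are all hypothetical.

```python
import smtplib
from email.message import EmailMessage
import pandas as pd

# Hypothetical results-manager extract: one row per open alarm exception.
exceptions = pd.read_csv("exceptions.csv", parse_dates=["due_date"])

overdue = exceptions[
    exceptions["response"].isna() & (exceptions["due_date"] < pd.Timestamp.today())
]

with smtplib.SMTP("mail.example.com") as smtp:  # assumed internal mail relay
    for _, row in overdue.iterrows():
        msg = EmailMessage()
        msg["To"] = row["owner_email"]
        msg["From"] = "carma-alerts@example.com"
        msg["Subject"] = f"Reminder: alarm {row['alarm_id']} awaiting your response"
        msg.set_content(
            f"Exception {row['alarm_id']} was due {row['due_date']:%Y-%m-%d}. "
            "Please respond in the results manager."
        )
        smtp.send_message(msg)
```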

The goal is to turn false positive/negative reports into ones that directly find the issue, thereby meeting the report's objective as quickly as possible.  Such tweaks to the process occur constantly over time as the process improves its risk management.  For example, matching vendor information to governmental “watch lists” could start with an exact name and address match, quickly expand to matching on close approximations of the name, the address, and the geolocation of ZIP codes, and then, once the business process owner decides to record TIN information for each vendor, add a TIN match against a government-funded TIN matching service.
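
A minimal sketch of the first two stages of that progression, assuming the vendor master and watch list are available as simple files with name and address columns (hypothetical layouts): exact name-and-address matches are flagged first, then near matches on the name using a standard-library similarity ratio; the 0.85 threshold is an illustrative assumption.

```python
from difflib import SequenceMatcher
import pandas as pd

vendors = pd.read_csv("vendor_master.csv")   # assumed columns: name, address
watch = pd.read_csv("watch_list.csv")        # assumed columns: name, address

def normalize(s):
    return " ".join(str(s).upper().split())

def similar(a, b):
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

hits = []
for _, v in vendors.iterrows():
    for _, w in watch.iterrows():
        name_score = similar(v["name"], w["name"])
        exact_addr = normalize(v["address"]) == normalize(w["address"])
        if name_score == 1.0 and exact_addr:
            hits.append((v["name"], w["name"], "exact name and address"))
        elif name_score >= 0.85:              # assumed fuzzy threshold
            hits.append((v["name"], w["name"], f"name similarity {name_score:.2f}"))

print(pd.DataFrame(hits, columns=["vendor", "watch_list_entry", "match_basis"]))
```

Later stages, such as ZIP-code geolocation or a TIN verification call, would extend the same comparison once that data is captured.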

The last, as yet unmentioned, part of any instant CARMA system surrounds the process: the consistent execution of such analytics.  Only through the collection and analysis of data points at consistent intervals can the organization and its automated system continuously “learn” how to adapt to the process.  Further, the risk manager can continuously score business processes by trending the now-validated alarms.  By trending the alarms and the business owners' risk responses, the risk manager can identify which departments and locations are ripest for an on-site review or, at the very least, an online conference meeting.
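
One way to sketch that scoring, assuming the validated alarm results are trended in a simple table with department, period, and alarm_count columns (a hypothetical layout): rank departments by current alarm volume plus growth over the trend window.

```python
import pandas as pd

# Hypothetical trended alarm results: one row per department per period.
alarms = pd.read_csv("alarm_trend.csv")  # columns: department, period, alarm_count

pivot = alarms.pivot_table(index="department", columns="period",
                           values="alarm_count", aggfunc="sum", fill_value=0)

# Simple score: current-period volume plus growth since the first period.
score = pivot.iloc[:, -1] + (pivot.iloc[:, -1] - pivot.iloc[:, 0])
print(score.sort_values(ascending=False).head(10))  # candidates for on-site review
```

Weightings for owner response times or repeat findings could be folded into the same score as the alarms mature.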

To learn more about dealing out instant C.A.R.M.A., please see my AuditNet® Minutes to Analytics webinar on risk planning scheduled for May 11th (http://bit.ly/1WPvQgW), followed by a complimentary webinar on June 8th on automating specific control reports (http://bit.ly/1MQYIDa).

Rich Lanza CPA, CFE, CGMA (www.richlanza.com) has over 25 years of audit
and fraud detection experience with specialization in data analytics,
business process diagnostics and cost recovery efforts. Rich wrote the
first book on practical applications of using data analytics in an audit
environment titled, 101 ACL Applications: A Toolkit for Today’s Auditor,
in addition to writing over 19 publications, and over 75 articles. Rich is
proficient and consults in the practical use of analytic software
including ACL, ActiveData for Excel, Arbutus Analyzer, IDEA, TeamMate
Analytics and auditing with Microsoft Excel techniques. Rich has been
awarded by the Association of Certified Fraud Examiners for his research
on proactive fraud reporting. He is also a regular presenter for CFO.com,
the Institute of Internal Auditors, Association of Certified Fraud
Examiners, AuditNet® and Lorman. Rich consults with companies ranging in size from $30 million to $100 billion and, in all, has helped them find money through the use of technology and recovery auditing.
He is also a current faculty member with the International
Institute for Analytics.


Per the Association of Certified Fraud Examiners' recent Report to the Nations, the median fraud loss differs by a mere $6,000 between large and small businesses, but that tells only a small part of the story.  While a small difference in dollar terms, small business fraud occurs at an average rate of $1,540 per employee, while large businesses sustain roughly a $16 per-employee effect.  That is nearly a 100x multiple in the impact of the median fraud on small versus large businesses.  It must also be noted that small businesses simply do not have the people, processes or technology to protect themselves as large companies do, which leads to even further losses.  What they need is affordable protection…… Read more.

As I prepare for an upcoming webinar on March 23rd and a live presentation in Williamsburg, Virginia in April 2016, I realize that auditors have an inherent need to select samples of a population, or do they? Sampling is done because a professional cannot gather data from the entire population, but what if you, the auditor, could for a given business process?

For any computer enthusiast, I suggest watching the movie The Imitation Game, in which they explain how to stop a code device named Enigma, which changed its code every 20 minutes:

There are 159 million, million, million possible Enigma settings. All we had to do was try each one. But if we had 10 people checking one setting a minute for 24 hours every day and seven days every week, how many days do you think it would take to check each of the settings? Well, it’s not days; it’s years. It’s 20 million years. To stop an incoming attack, we would have to check 20 million years’ worth of settings in 20 minutes.

Now, no one is expecting auditors to manually test every transaction. Understandably, it is much easier to pick a small sample and rely on statistics to extrapolate error rates. However, as in the movie and in real life (since the movie is based on a true story), shouldn't we also realize that we as auditors need a computerized machine to perform the calculations? In our current age of “Big Data” and ever more powerful PCs, why would we still rely on such an antiquated approach as manual sampling? It is almost as if we use sampling as a simplistic means of testing a business process, even when a better solution is staring us right in the face.

That solution is to audit as much of the business process data as possible with analytics, otherwise known as 100% auditing. From specific reports testing segregation of duties to data mining analysis that lets auditors visualize and understand the financial accounts, it is not a far stretch to imagine that at least half of the current procedures where sampling is applied could be turned into an analytic. Please note that many times a business process may not have computer-readable data, but isn't that an audit finding in itself? Shouldn't as much of the business process as possible be reduced to a digitized, searchable dataset rather than relying on paper copies of revenue and expense transactions?

We have to expect that the first year of 100% auditing may be a bit rough, as we would be identifying a variety of false positives as well as legitimate fixes to our business processes. I explain in a separate article how to work to remove false positives for those interested (http://bit.ly/1Pdz3Df). Aside from the findings identified for review, we would first need to establish the data methods to extract, normalize, and then filter to only those transactions associated with our control and financial balance testing. This is a year-one data and report investment, but it pays dividends in all future years.

Analytic-enabled testing can be completed in seconds and can be scheduled to be performed on a recurring basis. This is much faster than any sampling approach and, as in the Imitation Game, requires hardly any human resources to perform the tests. Rather, the “humans” can be focused on the exceptions of such reporting to help cure the business process by performing root cause analysis. Wouldn’t it be better for us to focus more on the solution than on testing whether we have a problem? Can’t we leave those menial tests for the computer?
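
As one concrete example of a test that replaces a sample with the full population, the sketch below flags potential duplicate payments across every record in an extract; the file name and columns (vendor_id, invoice_number, amount, invoice_date) are assumptions about the payment data, and the same script can simply be rescheduled against each new extract.

```python
import pandas as pd

# Full payment population, not a sample; layout is assumed for illustration.
payments = pd.read_csv("payments.csv", parse_dates=["invoice_date"])

# Same vendor, same invoice number, same amount: classic duplicate-payment test.
key = ["vendor_id", "invoice_number", "amount"]
dupes = payments[payments.duplicated(subset=key, keep=False)]

print(f"{len(dupes)} of {len(payments)} payments flagged for review")
print(dupes.sort_values(key))
```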

For more information on the two-CPE webinar entitled “Improving Fraud and Audit Sampling Productivity,” where we will explore sampling and improved ways to test 100% of populations, please click here (http://bit.ly/1La5d2R). I also look forward to seeing everyone at the April 2016 Williamsburg, Virginia Fraud Conference (http://bit.ly/1XB78id).

Preventing Fraud in Accounts Payable: Trend Talk by Rich Lanza

In November 2015, I delivered a 60-minute presentation on fraud and accounts payable at the London, UK Basware Trend Talks event.  Basware is the global leader in procure-to-pay and e-invoicing technology.  Please see the link below to the video, along with the learning objectives of this Trend Talk:

  • Understand the opportunities and threats that alternative financing schemes, such as supply chain finance and early payment discounts, present
  • Learn how you are susceptible to fraud, and how to scan your team for the signs
  • Discover why the vendor account presents one of the easiest ways to steal from the company, and how to identify different types of fraud across the procure-to-pay process
  • See how report surveillance has been shown to reduce fraud by 66% while dropping detection timeframes by 50%
  • Hear how cost-cutting reviews can become an effective defense against fraud
  • Walk away with new methods of detecting errors and fraud

This presentation is a summary of what my firm, Audit Software Professionals, provides to clients through the Business Process Diagnostic Reporting service, a data analytic service applied to core business cycles focusing on results: improving operations, identifying cash savings and detecting anomalies such as errors, fraud and inefficiencies.

Click Here to View the Taped Video Presentation

Click Here to Learn More About the Business Process Diagnostic Reporting Service

Turning Analytics From a “Nice to Have” to a “Must Have” in 2016

By Richard B. Lanza

I wanted to start this article with the following quote which I wrote in the past and believe to this day:

In my experience, as someone who teaches accountants to use audit software, the hardest step is the first one: making the decision to try an audit program. Once CPAs make that decision, learning how to use the application is relatively easy. From then on, the learning curve is swift as the auditor finds shortcuts for even more efficiency and discovers ways to make the audit process even more effective.

Ironically, it came from an article I wrote for the AICPA's Journal of Accountancy in 1998, in which I explained a variety of simple analytic techniques that anyone could apply. Yet for many of us, such usage still eludes our day-to-day auditing practices. For instance:

  1. 34% of auditors are required to use data analytics on audits with another 42% of them using such tools on an ad-hoc basis (2015 AuditNet® Data Analytics Use By Auditors)
  2. 47% of internal auditors either do not use or minimally use analytics with only 19% using them extensively (IIA Research Foundation: Staying a Step Ahead – Internal Audit’s Use of Technology 2015).
  3. 9% have continuous monitoring fully deployed, with 60% using data analytics informally or less (2012 AuditNet® Survey Report on Data Analysis Software)
  4. 90% of those who employ continuous monitoring report that their monitoring is focused on specific enterprise areas where there are known risks, with less than 50% monitoring key risk indicators and less than 50% monitoring fraud risk (Protiviti’s Changing Trends in Internal Audit and Advanced Analytics).
  5. While 82% of CAEs say they leverage data analytics in some specific audits, just 48% use it for scoping decisions and only 43% leverage data to inform their risk assessment (2015 PwC State of the Internal Audit Profession).

So when we look at the reasons why usage is not optimal, we are met with a variety of interrelated truths:

  1. 48% of auditors are not trained in analytics with nearly 25% more not having the time to learn the software (2015 AuditNet® Data Analytics Use By Auditors).
  2. Globally, only 1 out of 10 internal auditors had education in information systems or computer science (IIA Research Foundation: Staying a Step Ahead – Internal Audit’s Use of Technology 2015).
  3. Roughly 80% of audit teams do not link the use of analytics to their job-related objectives (2015 AuditNet® Data Analytics Use By Auditors)
  4. Current external audit standards are rooted in traditional audit processes where sampling, rather than the entire body of data, is still the focus (http://tinyurl.com/hpprm3e).

In our upcoming complimentary Minutes to Analytics 2 CPE webinar, Feb 10, 2016 – Turning Analytics From “Nice to Have” to “Must Have”, Jim Kaplan of AuditNet® and I will explore this topic and provide a series of next steps from best-practice organizations that have succeeded with analytics. In summary, we will at a minimum go over the following:

  • Identifying any fear of automation early in the process and ways to overcome such obstacles with team members.
  • Using new approaches on training that focus more on just-in-time approaches with specific and practical exercises.
  • Building a team of analytically-enabled auditors instead of relying on select champions with time constraints.
  • Identifying the “low hanging fruit” areas to begin your analytic efforts as you move to a more continuous approach.
  • Where Big Data can go from being just a buzzword to a key component of your 2016 testing plan.
  • The use of script libraries to automate your testing while using “the power of prototypes”.
  • How automation of the data access steps, namely the ETL (Extraction, Transformation and Loading) process, can help ensure a continuous audit process (see the sketch after this list).
  • How year one investments in analytics can lead to a perpetual cash machine for the organization in all subsequent years.
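
To make the ETL bullet above concrete, here is a minimal sketch of a scheduled extract-transform-load step in Python; the connection targets, table names, and columns are placeholders rather than any particular ERP's schema.

```python
import sqlite3
import pandas as pd

# Extraction: pull journal entries posted since the last run (placeholder source).
src = sqlite3.connect("erp_extract.db")
je = pd.read_sql_query(
    "SELECT posting_date, account, amount, description FROM journal_entries "
    "WHERE posting_date >= date('now', '-1 day')", src)

# Transformation: normalize types and text before testing.
je["posting_date"] = pd.to_datetime(je["posting_date"])
je["description"] = je["description"].str.strip().str.upper()

# Loading: append to the analytics store the audit tests run against.
dest = sqlite3.connect("audit_warehouse.db")
je.to_sql("journal_entries", dest, if_exists="append", index=False)
print(f"Loaded {len(je)} rows")
```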

We look forward to seeing you at our first 2016 training event as we embark on our fifth year of training auditors with analytics. Please click on this link to learn more and register for our complimentary 2 CPE training: Feb 10, 2016 – Turning Analytics From “Nice to Have” to “Must Have”.