Tuesday, August 21, 2007
Taking GL Analytics for a Test Drive -- Hard Way vs. Easy Way
"Dell Inc. said it would restate more than four years of its financial results, after a massive internal investigation found that unidentified senior executives and other employees manipulated company accounts to hit quarterly performance goals."
"The company [Dell] said it found evidence that various reserve and accrued-liability accounts were created or improperly adjusted -- usually at the close of the quarter to give the appearance that quarterly financial goals were met. The adjustments sometimes followed reviews of account balances "at the request of or with knowledge of senior executives." Dell added that employees in some business units purposely gave incomplete or incorrect information about these activities to headquarters personnel or auditors."
"According to the filing, the law firm Willkie Farr & Gallagher LLP and the accounting firm KPMG LLP led an investigation, using special software, that evaluated more than five million documents. They conducted 233 interviews with 146 individuals, according to the filling."
Sophisticated software can be used to analyze journal entries for suspicious patterns and potentially fraudulent transactions. Examples might include entries near month-end that subsequently reverse early in the next period, or entries made to accounts that normally receive only manual entries. Dell tested this kind of software the hard way -- as part of an SEC investigation and under advice from counsel.
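To make the idea concrete, here is a minimal sketch (not Dell's, KPMG's, or any vendor's actual software) of one pattern described above: entries posted in the last days of a period that are offset by an equal-and-opposite entry to the same account early in the next period. The entry fields and the day thresholds are illustrative assumptions.

```python
# Minimal sketch: flag journal entries posted near month end that are reversed
# early in the following period. Field names (entry_id, account, amount,
# post_date) are illustrative assumptions, not any GL system's schema.
from datetime import date, timedelta

def month_end(d):
    """Last day of the month containing d."""
    first_of_next = date(d.year + d.month // 12, d.month % 12 + 1, 1)
    return first_of_next - timedelta(days=1)

def find_suspect_reversals(entries, window_before=3, window_after=5):
    """Pair each entry posted within `window_before` days of month end with an
    equal-and-opposite entry to the same account posted within `window_after`
    days of the start of the next month."""
    suspects = []
    for e in entries:
        eom = month_end(e["post_date"])
        if (eom - e["post_date"]).days > window_before:
            continue
        next_start = eom + timedelta(days=1)
        for r in entries:
            if (r["account"] == e["account"]
                    and r["amount"] == -e["amount"]
                    and next_start <= r["post_date"] <= next_start + timedelta(days=window_after)):
                suspects.append((e["entry_id"], r["entry_id"]))
    return suspects

if __name__ == "__main__":
    demo = [
        {"entry_id": "JE-101", "account": "2150-Accrued Liab",
         "amount": 250000.0, "post_date": date(2007, 3, 30)},
        {"entry_id": "JE-207", "account": "2150-Accrued Liab",
         "amount": -250000.0, "post_date": date(2007, 4, 3)},
    ]
    print(find_suspect_reversals(demo))  # [('JE-101', 'JE-207')]
```

A real program would also weigh who posted the entry, whether it was manual, and materiality, but the quarter-end-then-reverse pattern is the core test.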
Other companies are beginning to test similar analytic software the easy way -- as part of a routine internal audit or management review of balance sheet and related journal entries.
I'm reminded of the old saying from my pharmaceutical days -- even though medicine is often much more expensive than vitamins, far fewer people actually buy the vitamins. As solution providers who often help clients with the preventive medicine of financial controls, I hope this illness causes a few more people to talk to their doctor while their health is still good.
Labels: GL Analytics, KPMG, Restatement
posted by Joe Oringel @ 8:14 AM
Sunday, July 01, 2007
Why did the fraud numbers increase?
Oversight's 2007 fraud survey shows a double-digit increase over the 2005 survey results, despite the implementation of Sarbanes-Oxley control regimens. Is there any way to reduce the reported fraud numbers? Does that mean we have to implement even more controls?
Over the past few weeks I've had the chance to discuss these results with a number of experts and have developed a consensus view that we can reduce fraud further. And, more surprisingly, we can do it with fewer, more "rationalized" controls. Thankfully, the "top-down, risk-based approach" advocated in Auditing Standard 5 (AS5) gives us the opening to effect this change.
Many (if not most) of the first iterations of Sarbanes-Oxley compliance controls were created with a bottom-up approach that attempted to cover every possible contingency: think of everything that could possibly result in financial reporting fraud, and then design a way to prevent it. While most control activity will reduce risk, there is a finite amount of time and effort available for all of that activity. Covering every possible contingency dilutes the overall fraud-reduction effort by spreading time across documenting low-value activities.
For instance, the physical security of the tapes used to back up the financial applications is an example of a Sarbanes-Oxley control activity with a relatively low fraud-reduction payback for the effort invested. To effect the fraud, someone would have to manipulate the precise fields on the backup tape to change the financial numbers, then cause the financial applications to crash, and then have the systems restored from the manipulated backup tapes. Frankly, restoring from a backup tape is not always the most reliable process. A lot of things have to happen in this fraudulent financial reporting scenario -- it's a low-probability occurrence.
When you compare the backup-tape scenario with a manager or other privileged user overriding controls and posting a fraudulent entry in the General Ledger (GL), it is clear that the management override is much easier to effect. Both are possible; one is much more probable and very difficult to absolutely prevent. Finding the irregular GL posting requires diligent forensic evaluation of journal entries, which takes time and expertise.
The top-down, risk-based approach advocated by regulators in AS5 would devote more effort to journal entry evaluation and reduce the time spent on low-probability risks. By rationalizing control activity according to the real risk, there's more time for the high-impact activities that materially affect fraudulent financial reporting. With AS5 we have the opportunity to shift control investments toward activities with a real payoff in fraud reduction.
Labels: AS5, Fraud monitoring
posted by Patrick Taylor @ 7:38 PM
Thursday, May 24, 2007
PwC Publishes 2007 Internal Audit Survey - "Continuous Auditing continues to generate interest"
CFO Magazine published some of the results of PwC’s annual Internal Audit Survey, with the headline focusing on the fact that some internal audit functions do not comply with the IIA standard of performing an annual risk assessment. While interesting and potentially worrisome, I’m personally comfortable that some of those numbers could be overstated, because Internal Audit may rely on other company risk assessment activities (e.g., Enterprise Risk Management) as input for their annual audit plan. As solution providers for continuous auditing and continuous monitoring solutions, my partner and I focused more on this year’s survey findings about Continuous Auditing (CA). Some highlights: significantly fewer companies (11% in 2007, down from 41% in 2006) reported that their CA programs were entirely manual. The acknowledgment that automation of some type is needed as an enabler for continuous auditing is noteworthy, and we are encouraged by market recognition that automation is essential to an effective CA program. Also noteworthy is that slightly fewer companies (11% in 2007, vs. 13% in 2006) reported having a fully implemented CA program in place.
Perhaps we can attribute that decline to better awareness of what a real CA program may entail.
Updating one's audit plan twice a year instead of annually may satisfy a textbook definition of continuous risk assessment, and thus continuous auditing. But personally, I would suggest that a "real CA" program examine TRANSACTIONS at regular intervals approaching weekly or even daily, and identify areas of risk and needed follow-up. We see confusion and clutter in the vocabulary describing continuous auditing and continuous monitoring today, despite numerous companies having successful CA programs in place.
This year's PwC survey shows that the audit profession is beginning to understand CA better, so I see that as a glass half full.
posted by Joe Oringel @ 8:53 AM
Tuesday, April 17, 2007
Oversight Systems Financial Executive Survey Finds More Than Half of Shared Services Centers Fall Short of Operational Goals
ATLANTA (April, 2007): Shared service centers have yet to show their full potential for many companies, according to the 2006 Oversight Systems Executive Report on Shared Service Centers. The national survey of financial executives found that more than half of respondents report their shared service centers (SSCs) are well short of achieving their operational goals. First implemented in the 1990s by many large enterprises, the SSC model allows companies to consolidate client-facing functions in an attempt to reduce costs. However, the report released today finds that 52 percent of executives report their SSCs are only meeting half or fewer of their business goals. The free report is available to download at www.oversightsystems.com/survey. "Companies adopted shared service centers for the immediate cost savings, but executives are now struggling to continually improve their operations," said Patrick Taylor, CEO of Oversight Systems. "This survey shows that shared service centers must develop strategies and implement systems that support ongoing improvement."
Reflecting the recent development of most shared service centers, more than half of executives (59 percent) report that their SSCs have been in operation for less than five years. Despite the youth of the concept, companies are putting much stock in these centers. Most executives (85 percent) report their SSCs serve four or more business units, with 40 percent reporting that they serve 10 or more. Although the C-suite goals are clear, achieving them is often met with adversity.
The most prevalent challenges to ongoing SSC operations were maintaining continuous improvement (61 percent), skepticism from business units (59 percent), employee retention/turnover issues (43 percent), meeting customer service level agreements (26 percent) and threats of outsourcing business processes (13 percent).
The Real Measure of Performance
When it comes to shared service centers, there is no measure of performance more important than cost savings, and that is the silver lining in this report. Nearly three-quarters of executives (73 percent) classify their SSC as “world class” or “average to above average.” As such, it comes as no surprise that nearly the same number of respondents (71 percent) report having almost reached, reached or exceeded their cost savings expectations. In fact, the study found that 85 percent of executives were prompted to embrace the SSC model in an attempt to reduce and control operating costs. Although cost was the driving factor for implementing an SSC model, it was not the only reason. Other reasons included:
* Improve quality (69 percent)
* Improve their customer focus (63 percent)
* Free up resources for other purposes (49 percent), and
* Improve company focus and reduce risks (34 percent).
Goals for 2006
Beyond the central goal of reducing costs, executives do have other goals for their shared service centers. Topping the list of 2006 goals, with 52 percent support, is improving on service level agreements (SLAs). Other popular goals include: re-engineering business processes (51 percent), increasing transaction throughput and capacity (40 percent), expanding business offerings (39 percent), and reducing aggregate error rates (35 percent). Less frequently cited goals include: increasing the percentage of one-touch transactions (30 percent), implementing Six Sigma programs (23 percent), and automating Sarbanes-Oxley compliance (20 percent).
Regardless of the hurdles faced in implementing and operating shared service centers, 97 percent of executives point to sustainable benefits of SSCs as opposed to traditional outsourcing of business processes. When compared to outsourcing, executives say SSCs offer benefits such as:
* Improved level of service and quality (81 percent)
* Better responsiveness to customer demands (68 percent)
* Greater flexibility in adapting to evolving business needs (62 percent)
* Lower aggregate costs of operations (51 percent).
posted by Zeleon @ 1:35 PM
Philadelphia Eagle Running Back Gets Paid $3M - Twice
When most people receive an errant duplicate paycheck, it means a couple thousand dollars at most -- something that may or may not even be detected, and something they may or may not report. But when an NFL superstar gets paid twice, the impact is a tad bit worse. According to an Associated Press article posted on the Fox Sports web site, titled "Eagles accidentally pay Westbrook twice," Brian Westbrook received an extra $3 million from the Philadelphia Eagles in an accounting error. The star running back intends to pay the team back after getting his roster bonus twice. However, the Eagles filed a grievance with the NFL against Westbrook because the money hasn't been repaid yet, a team spokesman said Saturday. Continuous monitoring of payroll transactions can detect potential duplicates BEFORE they leave the corporate boundaries. When you view duplicate payroll, duplicate vendor payments, unused discounts and unused credits in cumulative form, you're talking real money and real materiality.
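As a rough illustration of what such a monitoring check might look like, here is a minimal sketch that flags potential duplicate payments before a disbursement run is released. The payment fields, the matching key, and the 14-day window are assumptions made for the example, not any payroll or AP system's actual logic.

```python
# Minimal sketch of a duplicate-payment check run before a disbursement file
# goes out. Field names (payment_id, payee_id, amount, pay_date) are
# illustrative assumptions, not a real payroll system's schema.
from collections import defaultdict
from datetime import date, timedelta

def find_potential_duplicates(payments, window_days=14):
    """Group payments by (payee, amount) and flag pairs issued within a short
    window of each other -- candidates for review, not automatic rejections."""
    by_key = defaultdict(list)
    for p in payments:
        by_key[(p["payee_id"], round(p["amount"], 2))].append(p)

    flagged = []
    for group in by_key.values():
        group.sort(key=lambda p: p["pay_date"])
        for earlier, later in zip(group, group[1:]):
            if later["pay_date"] - earlier["pay_date"] <= timedelta(days=window_days):
                flagged.append((earlier["payment_id"], later["payment_id"]))
    return flagged

if __name__ == "__main__":
    demo = [
        {"payment_id": "PR-9001", "payee_id": "EMP-36", "amount": 3_000_000.0,
         "pay_date": date(2007, 3, 9)},
        {"payment_id": "PR-9044", "payee_id": "EMP-36", "amount": 3_000_000.0,
         "pay_date": date(2007, 3, 16)},
    ]
    print(find_potential_duplicates(demo))  # [('PR-9001', 'PR-9044')]
```

The same pattern extends to vendor payments, unused discounts, and unused credits; the value is in running it continuously, before the money leaves the building.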
posted by Zeleon @ 1:31 PM
Monday, April 16, 2007
Current Privileged User Monitoring Solutions Don't Leverage Lessons from the Past
I just read a ComputerWorld blog entry written by Eric Ogren regarding the need to focus "Privileged User Monitoring" on transaction and business monitoring versus the old access management model.
I could not agree with him more. If there's one thing information security professionals can tell you with confidence... it's what does not work. Things change so frequently within the IT risk domain that it's often difficult to solve a problem with certainty. But when it comes to dealing with "trusted" users in the real world, we all know what doesn't work. What does not work is printing out long monthly lists of users with "excess privileges" and expecting this to significantly reduce the risk of fraud and misuse -- at least at the material levels associated with SOX and A-123. In today's world, access management and provisioning are a serious manpower drain. And when you couple this with the need to provide periodic reports identifying the issues and progress, that just adds more manpower requirements... UNLESS you shift the focus to the highest-risk issues and higher-impact solutions.
Printing out these monthly excess-privilege lists places a huge burden on our IT and InfoSec professionals, but operational realities are operational realities. Key managers still receive conflicting privileges in order to support all areas under their control. And key managers also receive powerful privileges, such as those allowing them to actually "override" existing system-based controls. 99% of the time, they're probably just doing their jobs and ensuring the business keeps functioning properly. But it's that other 1% that can result in a major failure -- e.g., a privileged user modifying quarterly revenue with a simple manual journal entry to conceal a bad quarter. In this case, the user is just using an "authorized" privilege for an "unauthorized" change.
And what about when an AP Manager creates a vendor, purchase order, invoice, and voucher as part of an elaborate procurement fraud scheme?
Or when a database administrator uses root access to modify a payment record just before it's released through the EFT system?
All of these are real examples of high-risk conditions and real-world incidents involving trusted insiders -- or privileged users.
So let's stop using the 20/80 solution model, flip things around, and do the 80/20 thing. Meaning: let's stop focusing on routine user access privilege conflicts and, instead, monitor and detect the use of privileges to misuse the system or commit fraud.
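As an illustration of that shift, here is a minimal sketch of one detective rule: scanning an application event log for a single user who performed every step of the procure-to-pay cycle for the same vendor, as in the AP Manager scenario above. The log fields and action names are assumptions for the example, not any ERP's actual audit-trail schema.

```python
# Minimal sketch: from an application event log, find (user, vendor) pairs
# where one user performed all four procure-to-pay steps. Field names and
# action names are illustrative assumptions.
from collections import defaultdict

FULL_CYCLE = {"create_vendor", "create_po", "enter_invoice", "create_voucher"}

def users_completing_full_cycle(events):
    """Return (user, vendor_id) pairs where one user performed every step."""
    actions = defaultdict(set)
    for e in events:
        actions[(e["user"], e["vendor_id"])].add(e["action"])
    return [key for key, acts in actions.items() if FULL_CYCLE <= acts]

if __name__ == "__main__":
    demo = [
        {"user": "ap_mgr1", "vendor_id": "V-778", "action": a}
        for a in ("create_vendor", "create_po", "enter_invoice", "create_voucher")
    ]
    print(users_completing_full_cycle(demo))  # [('ap_mgr1', 'V-778')]
```

The point is the 80/20 shift: rather than cataloging who *could* do this, the rule watches for the moment someone actually *does* it.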
posted by Zeleon @ 5:59 PM
Thursday, April 12, 2007
Not Complying With OFAC Can Impact Your D&O Policy
Most organizations consider OFAC compliance to be just a routine issue, but the Department of the Treasury means business when it comes to doing any type of business with forbidden countries, business entities, and people. And insurance carriers are beginning to translate this into policy and payout restrictions that could have a significant impact on an unsuspecting company or individual that just happens to stumble upon a long-term OFAC violation.
The Department of the Treasury is quite clear that any delay in reporting any and all dealings with OFAC-listed entities can result in serious consequences. Is a quarterly review of the OFAC list good enough? Well, you be the judge. Here's what the Treasury has to say in its FAQ:
DIRECTLY FROM THE U.S. TREASURY WEB SITE:
QUESTION: At what point must an insurer check to determine whether an applicant for a policy is an SDN?
ANSWER: If you receive an application from an SDN for a policy, you are under an obligation not to issue the policy. Remember that when you are insuring someone, you are providing a service to that person. You are not allowed to provide any services to an SDN. If the SDN sends a deposit along with the application, you must block the payment. [09-10-02]
QUESTION: What should an insurer do if it discovers that a policyholder is or becomes an SDN--cancel the policy, void the policy ab initio, non-renew the policy, refuse to pay claims under the policy? Should the claim be paid under a policy issued to an SDN if the payment is to an innocent third-party (for example, the injured party in an automobile accident)?
ANSWER: The first thing an insurance company should do upon discovery of such a policy is to contact OFAC Compliance. OFAC will work with you on the specifics of the case. It is possible a license could be issued to allow the receipt of premium payments to keep the policy in force. Although it is unlikely that a payment would be licensed to an SDN, it is possible that a payment would be allowed to an innocent third party. The important thing to remember is that the policy itself is a blocked contract and all dealings with it must involve OFAC. [09-10-02]
QUESTION: How frequently is an insurer expected to scrub its databases for OFAC compliance?
ANSWER: That is up to your firm and your regulator. Remember that a critical aspect of the designation of an SDN is that the SDN's assets must be frozen immediately, before they can be removed from U.S. jurisdiction. If a firm only scrubs its database quarterly, it could be 3 months too late in freezing targeted assets. The SDN list may be updated as frequently as a few times a week or as rarely as once in six months. [09-10-02]
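For illustration, here is a minimal sketch of a recurring (say, daily rather than quarterly) screen of a customer or policyholder file against a locally maintained copy of the SDN names. The file names and the simple normalized exact match are assumptions; real screening programs work from OFAC's published list and use much fuzzier name matching.

```python
# Minimal sketch of a recurring SDN screen. The input files (one name per
# line) and exact normalized matching are assumptions for the example;
# production screening needs OFAC's actual published data and fuzzy matching.
def load_names(path):
    with open(path, encoding="utf-8") as f:
        return {line.strip().upper() for line in f if line.strip()}

def screen_customers(customer_names, sdn_names):
    """Return customers whose normalized name appears in the SDN name set."""
    return [name for name in customer_names if name.strip().upper() in sdn_names]

if __name__ == "__main__":
    sdn = load_names("sdn_names.txt")            # hypothetical local copy, refreshed daily
    customers = load_names("policyholders.txt")  # hypothetical customer master extract
    for hit in screen_customers(customers, sdn):
        print("Potential SDN match -- route to OFAC compliance:", hit)
```

The frequency question in the FAQ above is the operative one: the check itself is cheap, so running it every time the SDN list or the customer file changes is what closes the three-month gap.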
posted by Zeleon @ 11:13 PM
Wednesday, April 11, 2007
Katrina Fraud: FEMA and Army Corps of Engineers Ripped Off
Hurricane Katrina was truly one of the most horrific natural disasters ever to hit American soil. Thousands were killed, injured, or left homeless. Unless you've experienced this type of loss firsthand, it's probably meaningless to even try to fully understand the suffering these families went through.
And, during this time of emergency and national outreach to the victims, FEMA and the US Army Corps of Engineers rushed to open their wallets to the rightful victims. Billions of dollars of Federal Disaster relief funds poured into the region to help feed the hungry, put clothes on the homeless, and to shelter those without the capacity to shelter themselves. Regardless of all the stories written about either agency's preparedness levels or ability to actually respond to such a catastrophe, these agencies truly pushed the envelope of financial management and controls to put mission and operational necessity first - before bureaucracy. We applaud them for that.
But, take a look at the attached link to some of the recent stories related to the rampant fraud associated with these relief efforts. It's absolutely atrocious. And, if you want more insight, go to http://www.gao.gov/new.items/d06844t.pdf for the entire GAO report released this past summer.
What you will notice is that somewhere between $600M and $1.4B (yes, billion) was lost just to improper payments associated with individual assistance. Think about this for a moment: $600M to $1.4B lost to just one category of risk. And that's not the only shocking finding. This loss represents between 10-20% of the total funds spent on individual assistance. Wow! 10-20% of all funds intended for people in need went to the lowliest types of fraudsters in the world... those who would steal from starving and homeless children so they might be able to enjoy a night at the strip club (actual case study info).
If you dig into the fraud carried out against the actual reconstruction effort, and what the US Army Corps of Engineers may have been swindled out of, the cost is surely too staggering for most of us to really appreciate.
I do have one major recommendation, though: in many, many of the reported cases, simple continuous monitoring-based controls would have prevented the fraud.
For example, 16% of the fraud could have been prevented with a better individual-assistance registration procedure. Simple monitoring-based controls that alerted FEMA to invalid social security numbers, bogus addresses, invalid registrant-to-address matches, and duplicative registration data among multiple recipients would have shut down the majority of this type of fraud.
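As a rough sketch of the kinds of checks just described, the following flags registrations with badly formed social security numbers, the same SSN appearing on multiple registrations, and many registrations sharing one address. Field names are assumptions, and the SSN test is a format check only, not verification against SSA records.

```python
# Minimal sketch of registration screening: format-invalid SSNs, one SSN on
# multiple registrations, and multiple registrations at one address.
# Field names (reg_id, ssn, address) are illustrative assumptions.
import re
from collections import Counter

SSN_RE = re.compile(r"^(?!000|666|9\d\d)\d{3}-(?!00)\d{2}-(?!0000)\d{4}$")

def invalid_ssns(registrations):
    """Registrations whose SSN fails a basic format/validity pattern."""
    return [r["reg_id"] for r in registrations if not SSN_RE.match(r["ssn"])]

def duplicate_field(registrations, field, threshold=2):
    """Field values (SSN, address, ...) shared by `threshold` or more registrations."""
    counts = Counter(r[field] for r in registrations)
    return {value: n for value, n in counts.items() if n >= threshold}

if __name__ == "__main__":
    regs = [
        {"reg_id": "R1", "ssn": "123-45-6789", "address": "12 Oak St, Gulfport MS"},
        {"reg_id": "R2", "ssn": "123-45-6789", "address": "12 Oak St, Gulfport MS"},
        {"reg_id": "R3", "ssn": "000-12-3456", "address": "9 Elm Ave, Biloxi MS"},
    ]
    print(invalid_ssns(regs))               # ['R3']
    print(duplicate_field(regs, "ssn"))     # {'123-45-6789': 2}
    print(duplicate_field(regs, "address")) # {'12 Oak St, Gulfport MS': 2}
```

None of this is exotic; run before payment rather than after, checks like these are exactly the "simple monitoring-based controls" the GAO findings point to.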
If Oversight had been in place, the prevented loss would have been between $96M and $224M (16% of the $600M to $1.4B estimate). What could FEMA have done with these funds if they had not made their way into the fraudsters' hands?
posted by Zeleon @ 11:57 PM
Retired Congressman Michael Oxley blames the PCAOB for starting "all the problems" with the Sarbanes-Oxley Act
Well, it seems all the pain, agony, and expense we've all experienced while implementing the requirements of "Sarbanes-Oxley" was just a big misunderstanding. The originating lawmakers actually intended the process to be much easier and much more focused on a risk-based approach. Somehow, the executing agencies and audit community just misunderstood the real intent -- that is, according to a recently published interview with Congressman Oxley in CFO Magazine.
According to the interview with CFO Magazine, Congressman Oxley says "It was Auditing Standard No. 2 [the standard for auditing internal controls over financial reporting], promulgated by the PCAOB, that started all the problems."
He further elaborates by stating "Of the complaints you hear [about Sarbox], 99.9 percent are about 404. It was two paragraphs long, but by the time the PCAOB was done, it was 330 pages of regulations. It was far too prescriptive and [more] expensive than anyone anticipated."
Take a quick look at the attached article. It's very enlightening.
Also, you'll be pleased to note that Congressman Oxley believes the true intent of the law is only now being realized. His most encouraging quote is a resounding call for a risk-based approach to risk management. For example, he is quoted by CFO Magazine stating, "the Securities and Exchange Commission proposed a risk-based assessment to better define material weakness, with more emphasis on internal audit. It adds flexibility with smaller companies. Those are common-sense proposals that I am confident will be adopted this year with a 5-0 vote, which would be a ringing endorsement of [SEC chairman Christopher] Cox's leadership and reaffirmation that the SEC and PCAOB want it to work in a more efficient manner. It will protect the investor and make regulations work to everyone's satisfaction."
Personally, I like Congressman Oxley's reference to "common sense." If we were all able to define, implement, and manage our risks based on common sense and traditional ROI principles, I think we would all find it easier to embrace the true benefits of quality and compliance programs. The emphasis would shift from "what do we have to do to comply" to "what do we need to do to optimize operations and returns." Herein lies the financial controls challenge of this decade: how do we make the shift from "a world from an auditor's perspective" to "a world from an operations perspective"?
posted by Zeleon @ 11:29 PM