Should the nature of Internal Audit services under Advisory and Assurance be unlimited? And should the methodologies be uncapped too? Scratching your heads? Let’s break it down then.

The nature of Internal Audit services has no bar, especially under the Advisory category, simply because an entity might benefit from internal audit expertise in any area and in any manner it deems appropriate.

However, the methodologies, which are the blueprint for executing engagements, are usually pre-programmed, driven by standard work approaches under the work programs. This should not be the case. Methodologies need not conform to any specifics or, more appropriately, shouldn’t be typified!

It is when methodologies aren’t typified that we become open to advisory opportunities stemming from assurance engagements; not just for the commercial benefit, as is the case with outsourced IA service providers, but also for a more wholesome experience of getting systems and processes improved.

It’s also when methodologies aren’t standardized that we get to answer the evolving needs of an in-progress engagement and modify our approach as we go. Such evolving needs could take the form of new information, new relationships or patterns within data, red flags, unrelated data, availability of more credible data that is easier to obtain and analyze, etc. With a programmed methodology that remains rigidly in place, the benefits remain elusive and the auditor’s creativity stays capped.

So, what are methodologies anyway? What do they typically look like, and how should they be worked out so that they don’t just fulfill the objectives of an engagement but redefine them?

A methodology is a means to an end in the context of the audit engagement outcome. It’s the ‘CDDD’ of an activity, which in our case is the internal audit activity, an internal audit service, or simply an internal audit engagement. ‘CDDD’ is the approach we Conceive, Design, Develop and Deploy in planning, executing, finalizing and reporting an audit engagement.

For the already acquainted mind, yes, it’s true: methodologies do need to encompass reporting as well, because without that, a diligently executed engagement would not deliver the objectives it was designed to achieve, since the reporting could be sub-optimal or even irrelevant. Reporting should also evolve if the engagement approach has evolved from the point it took off!

So, let’s now move on to understanding methodologies in the context of different types of internal audit engagements / services, and how these shouldn’t subscribe to any one particular understanding.

For each type of IA engagement / service below, we first outline what a typical methodology would look like, and then what an agile methodology could add on top of it.

Compliance Engagement example: is the payment credit period policy being adhered to?

Obtaining payment data for the engagement period.

Reading the credit policy and extracting points for use as criteria to test the data.

Working out the verification schedule from receipt of invoice to its verification to recording of payable and eventually to its payment.

Computing differences between dates and adding all these to determine the total time consumed for each case.

Comparing the computed total time to credit period time.

Determining degree of compliance and reporting.

The following could be added:

Credit policy review required: it is not aligned with the working capital requirements and costs.

The invoice receipt mechanism is deficient in recording correct dates.

Data obtained for the period is incomplete, such that the data being operationally monitored excludes a certain population from scrutiny by design, for instance by recording receipt of invoice at a very late or a very early stage.


AND MORE!
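The typical compliance steps above can be sketched in a few lines. This is a minimal illustration only, assuming hypothetical payment records and a 30-day credit period as the policy criterion:

```python
from datetime import date

# Hypothetical payment cases: (invoice received, payment made)
payments = [
    (date(2024, 1, 5), date(2024, 2, 1)),    # 27 days
    (date(2024, 1, 10), date(2024, 2, 25)),  # 46 days
    (date(2024, 2, 1), date(2024, 2, 20)),   # 19 days
]

CREDIT_PERIOD_DAYS = 30  # assumed criterion extracted from the credit policy

# Compute the total time consumed for each case and compare to the credit period
breaches = [(recv, paid, (paid - recv).days)
            for recv, paid in payments
            if (paid - recv).days > CREDIT_PERIOD_DAYS]

# Degree of compliance for reporting
compliance_rate = 1 - len(breaches) / len(payments)
print(f"Compliance rate: {compliance_rate:.0%}")  # → 67%
```

The agile additions don’t fit neatly into a script like this, which is exactly the point: the script answers the original question, while the evolving observations about the policy and the invoice receipt mechanism come from keeping the approach open.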

Assurance Engagement example: are all the controls over procurement transactions working as intended?

Identifying all control checkpoints in the process from the procedure and recording these as criteria.

Obtaining all procurement orders issued during the period.

Selecting a sample for testing the application of controls and reviewing these.

Projecting the results, determining assurance results and reporting.

The following could be found:

Controls are performing as intended but could be revamped / replaced / improved.

Controls are not performing as intended, but there’s an adequate reason for it; maybe a design flaw, and the risks have been managed.

Process or a part thereof is redundant.

Controls are performing but do not meet the control objectives.

The procurement data being operationally monitored for compliance is skewed such that only purchase orders issued are examined, ignoring requisitions raised.


AND MORE!
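The typical sampling-and-projection steps can likewise be sketched. This is an illustrative attribute test over invented data, assuming three hypothetical control checkpoints (approval, three quotes obtained, GRN matched):

```python
import random

random.seed(42)  # deterministic illustration

# Hypothetical population of procurement orders with control attributes
population = [
    {"order_id": i,
     "approved": i % 25 != 0,       # a few unapproved orders seeded in
     "three_quotes": i % 40 != 0,   # a few missing-quotes orders
     "grn_matched": True}
    for i in range(1, 501)
]

# Select a sample for testing the application of controls
sample = random.sample(population, 50)

# An order passes only if every control checkpoint operated as intended
exceptions = [o for o in sample
              if not (o["approved"] and o["three_quotes"] and o["grn_matched"])]

# Project the sample exception rate onto the population for reporting
sample_rate = len(exceptions) / len(sample)
projected_exceptions = round(sample_rate * len(population))
print(f"Sample exception rate: {sample_rate:.1%}, "
      f"projected exceptions: ~{projected_exceptions}")
```

Note what the script cannot do: it can tell you whether controls fired, but not whether a firing control actually meets its control objective, or whether the process itself is redundant. Those findings come from the un-typified side of the methodology.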

Analytics Engagement example: can we identify the topmost and the bottom most cost drivers?

Identifying and obtaining relevant data.

Designing monitoring and evaluating metrics.

Mapping the data to the evaluating metrics, recording inferences and reporting.

The following could be considered:

Access controls over the application carrying the data are non-existent or ineffective.

Data input controls are flawed, such that data doesn’t qualify for analyses.

Data processing controls are deficient such that processed data is inaccurate.

A lot of other reports could be developed using, for instance, SQL that could be more effective and efficient in routine monitoring.


AND MORE!
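The typical analytics steps, identifying the topmost and bottom-most cost drivers, reduce to mapping data to an evaluating metric and ranking. A minimal sketch over hypothetical ledger entries:

```python
from collections import defaultdict

# Hypothetical cost ledger entries: (cost driver, amount)
ledger = [
    ("Raw materials", 120_000), ("Freight", 18_000),
    ("Raw materials", 95_000), ("Utilities", 22_000),
    ("Maintenance", 8_000), ("Freight", 12_000),
    ("Utilities", 19_000), ("Maintenance", 5_500),
]

# Evaluating metric: total cost per driver
totals = defaultdict(int)
for driver, amount in ledger:
    totals[driver] += amount

# Rank drivers from highest to lowest total cost
ranked = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
print("Topmost cost driver:", ranked[0])       # ('Raw materials', 215000)
print("Bottom-most cost driver:", ranked[-1])  # ('Maintenance', 13500)
```

The same ranking could, of course, be a `GROUP BY ... ORDER BY SUM(amount)` query against the source system, which is what makes routine monitoring via SQL reports, as noted above, the more efficient long-term answer.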

(Advisory) Process Audit Engagement example: review of the customer / sales order processing for improvement

Studying the process from receipt of order to delivery and performing a walkthrough to confirm understanding.

Review of the relevant procedures governing the process.

Identifying process deficiencies, alignment with procedures and procedural deficiencies.

Recording findings and reporting.

The following could be checked:

Criteria for regular monitoring are deficient, such that they don’t tell whether control objectives are being fulfilled, necessitating a regular assurance engagement rather than an advisory one.

What needs to be monitored is not identified.

Further avenues / sales channels could be explored for instance to decrease the processing cost per order.

More methods to obtain sales orders could be introduced for identifying most efficient sales order types.

Marketing and promotion could be reviewed and aligned to enhance customer knowledge.

Production and stocks availability could be linked to ordering.


AND MAYBE MORE! 
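One of the agile ideas above, decreasing the processing cost per order and comparing sales channels, lends itself to a simple metric. A sketch over invented order-processing records, with channel names purely illustrative:

```python
from collections import defaultdict

# Hypothetical order-processing records: (sales channel, processing cost)
orders = [
    ("Phone", 14.0), ("Web", 3.5), ("Phone", 12.5),
    ("Web", 4.0), ("EDI", 1.2), ("Web", 3.0), ("EDI", 1.5),
]

total_cost = defaultdict(float)
order_count = defaultdict(int)
for channel, cost in orders:
    total_cost[channel] += cost
    order_count[channel] += 1

# Evaluating metric: average processing cost per order, by channel
cost_per_order = {ch: total_cost[ch] / order_count[ch] for ch in total_cost}
for ch, avg in sorted(cost_per_order.items(), key=lambda kv: kv[1]):
    print(f"{ch}: {avg:.2f} per order")
```

A ranking like this turns a vague advisory suggestion (“explore further sales channels”) into a concrete, comparable number the client can act on.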

Risk Management / Assessment Engagement example: review of the risks in payroll processing

Reviewing the payroll risk register.

Reviewing the process, identifying the complete process trail, and determining whether the risk register is complete.

Reviewing the governing procedures.

Performing walkthrough testing to confirm all reviews and alignment.

Recording findings and reporting.

The following could also be considered:

A documented risk might not be a risk.

Redundant control processes.

Heat map needs revision as the residual risk assessment might be inaccurate.

Issues in integration with other associated processes.

Regular controls testing not in place.


AND MORE!
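The heat-map revision point above can be made concrete with a toy residual-risk model. The register entries, the scoring scales, and the control-effectiveness factor are all hypothetical; the model is deliberately simplified:

```python
# Hypothetical payroll risk register entries:
# (risk, likelihood 1-5, impact 1-5, control effectiveness 0.0-1.0)
register = [
    ("Ghost employees", 3, 5, 0.8),
    ("Wrong tax withholding", 4, 3, 0.5),
    ("Late salary disbursement", 2, 2, 0.9),
]

def residual_score(likelihood, impact, control_effectiveness):
    """Inherent risk (likelihood x impact) reduced by control effectiveness."""
    return likelihood * impact * (1 - control_effectiveness)

def heat_bucket(score):
    """Place a residual score into an illustrative three-band heat map."""
    if score >= 8:
        return "High"
    if score >= 4:
        return "Medium"
    return "Low"

for risk, lik, imp, ctl in register:
    score = residual_score(lik, imp, ctl)
    print(f"{risk}: residual {score:.1f} -> {heat_bucket(score)}")
```

Re-running such a model after walkthrough testing is exactly how an inaccurate residual assessment, and hence a heat map needing revision, gets surfaced: if controls turn out less effective than documented, the buckets shift.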

The more you think of, the more you enrich your methodologies. And the engagement objectives keep evolving in tandem with the methodologies: you begin with one objective in mind and end up with many more that allow you to accomplish so much more!

 

That’s why engagement work programs should never be considered anything more than the minimum guidelines to be followed, or the minimum work to be done, to achieve adherence with the baseline standard. A work program is just a fraction of the overall methodology.

The reason is simple: don’t codify your approach unless you want improvement to be an afterthought!