There are two key themes in the revised APRA EFS requirements – an increase in the granularity of the data sought and the alignment of reporting requirements with the size and complexity of the organisation.

APRA is seeking data at a more granular level, such as contract- and transaction-level detail, which calls for considerable investment in systems and processes on the part of ADIs.

While the overall number of forms required to be submitted has been reduced, the introduction of new industry and sectoral classifications and a data quality framework, coupled with the increased granularity, will increase the reporting burden, especially for the larger ADIs.

Applicability for different types of institutions:

a.Bank ADIs: 10 reporting forms are mandatory and 11 reporting forms are conditional upon meeting the following thresholds – Repo & securities lending >= A$1b (1 report), Gross derivatives position >= A$1b (1 report), Margin lending >= A$150m (1 report), Total assets >= A$5b (1 report), Total assets >= A$10b (1 report), Business credit >= A$2b (2 reports), Housing credit >= A$6b (2 reports), Personal credit >= A$500m (2 reports).

b.Non-bank ADIs: 2 reporting forms are mandatory and 13 reporting forms are conditional upon meeting the following thresholds – Total assets >= A$200m (3 reports), Repo & securities lending >= A$1b (1 report), Gross derivatives position >= A$1b (1 report), Margin lending >= A$150m (1 report), Total assets >= A$5b (1 report), Business credit >= A$2b (2 reports), Housing credit >= A$6b (2 reports), Personal credit >= A$500m (2 reports).

c.RFCs: 2 reporting forms are mandatory and 18 reporting forms are conditional upon meeting the following thresholds – Total assets >= A$50m (3 reports), Total assets >= A$500m (6 reports), Repo & securities lending >= A$1b (1 report), Gross derivatives position >= A$1b (1 report), Margin lending >= A$150m (1 report), Business credit >= A$2b (2 reports), Housing credit >= A$6b (2 reports), Personal credit >= A$500m (2 reports).
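The threshold logic above lends itself to a simple rule table. The following is a minimal sketch for bank ADIs, using the threshold values and report counts stated above; the function and metric names are our own shorthand, not part of the standard.

```python
# Illustrative rule table for bank ADIs: (metric, threshold in A$, number
# of conditional reports triggered). Values follow the thresholds above;
# the metric names are our own shorthand.
THRESHOLDS_BANK_ADI = [
    ("repo_and_securities_lending", 1_000_000_000, 1),
    ("gross_derivatives_position", 1_000_000_000, 1),
    ("margin_lending", 150_000_000, 1),
    ("total_assets", 5_000_000_000, 1),
    ("total_assets", 10_000_000_000, 1),
    ("business_credit", 2_000_000_000, 2),
    ("housing_credit", 6_000_000_000, 2),
    ("personal_credit", 500_000_000, 2),
]

def conditional_report_count(balances: dict, thresholds=THRESHOLDS_BANK_ADI) -> int:
    """Count the conditional reports an institution must file."""
    return sum(
        n_reports
        for metric, threshold, n_reports in thresholds
        if balances.get(metric, 0) >= threshold
    )

# A bank ADI with A$7b total assets and A$8b housing credit files
# 10 mandatory forms plus 3 conditional ones (assets >= A$5b, plus
# housing credit >= A$6b triggering 2 reports).
print(conditional_report_count(
    {"total_assets": 7_000_000_000, "housing_credit": 8_000_000_000}
))  # 3
```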

In addition to the reforms in EFS reporting, APRA is in the process of revamping many other supervisory reporting standards, including:

a.Revised liquidity reporting framework

b.Introduction of NSFR reporting mandate

c.Introduction of the Standardised Approach for Counterparty Credit Risk (SA-CCR)

d.Revisions to Large Exposure framework

e.Revisions to minimum ADI capital requirements (final Basel III components)

f.Capital adequacy revisions

g.Introduction of Leverage Ratio

Data relevance, data granularity and data quality emerge as the core pillars across all of these upcoming regulatory reforms.

Apart from the monetary investment in upgrading the technology infrastructure, compliance with the prescribed data quality and accuracy frameworks will require re-alignment of processes to enable collection of data from disparate source systems.

This may involve considerable re-engineering of the current practices of recording and consolidating data. Moreover, given the prescribed data accuracy thresholds, a shift towards automation of processes has never been more critical.

EFS implementation timelines

Report classification           | Form code                                                                                                                      | First reporting period
Balance sheets                  | ARF 720.0A, ARF 720.0B, ARF 720.1A, ARF 720.1B, ARF 720.2A, ARF 720.2B, ARF 720.3, ARF 720.4, ARF 720.5, ARF 720.6, ARF 720.7  | March 2019
Business finance                | ARF 741.0                                                                                                                      | July 2019
Business interest rates         | ARF 742.0A/B                                                                                                                   | July 2019
Household finance               | ARF 743.0, ARF 745.0                                                                                                           | July 2019
Household interest rates        | ARF 744.0A/B, ARF 746.0A/B                                                                                                     | July 2019
Lending and funding statistics  | ARF 747.0A/B, ARF 748.0A/B                                                                                                     | July 2019
Balance sheets                  | ARF 721.0A/B, ARF 723.0                                                                                                        | September 2019
Balance sheets                  | ARF 722.0                                                                                                                      | September 2019
—                               | ARF 730.0                                                                                                                      | September 2019
—                               | ARF 730.1                                                                                                                      | June 2020

Data Preparation

The new EFS reports have been configured to ensure both consistency of data definitions across reporting forms and reconciliation of data reported at the lowest level of granularity (sought in one report) with the corresponding aggregated and GL-level balances (sought in another).

To this end, the data quality framework set out in RPG 702.0 requires that the reported data conform to the stipulated accuracy thresholds.
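A minimal sketch of the kind of reconciliation this implies: the sum of granular records reported in one form should agree with the aggregated GL balance reported in another, within an accuracy threshold. The 2% tolerance and all names here are illustrative assumptions, not APRA's actual thresholds.

```python
# Sketch of a granular-to-GL reconciliation check. The 2% tolerance is an
# illustrative assumption; RPG 702.0 sets its own accuracy thresholds.
def reconciles(granular_amounts, gl_balance, tolerance=0.02):
    """True if the granular records sum to the GL balance within tolerance."""
    total = sum(granular_amounts)
    if gl_balance == 0:
        return total == 0
    return abs(total - gl_balance) / abs(gl_balance) <= tolerance

loans = [1_200_000, 850_000, 430_000]   # contract-level records
print(reconciles(loans, 2_480_000))     # True: exact match
print(reconciles(loans, 2_600_000))     # False: ~4.6% variance
```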

Conducting a thorough gap analysis of current data against the new requirements could be the first step towards building a robust data acquisition and alignment strategy for the solution.

The presence of disparate core banking systems warrants the use of consistent data dictionaries across source systems, along with end-to-end data trail and audit mechanisms, to satisfy the data quality and accuracy thresholds laid out by APRA.

Cloud and Information Security

Cloud infrastructure has evolved to provide subscribers a fast, reliable, safe and scalable platform in a cost-effective manner. Unless you have extensive customisation requirements, a cloud-based solution provides a seamless experience without the need to worry about hardware sizing and procurement, long and expensive implementation timelines, and the like.

Argus has partnered with IBM to provide a highly secure environment that ensures the safety of your data and guards against malicious attacks. Every client has a unique URL and is redirected via a secure web gateway that can detect impersonation, fraudulent access and similar threats. A second firewall is placed across the installation to block access from potentially dangerous IP zones. In the further interest of security, a separate instance of our market-leading platform is provisioned for each client, eliminating any chance of data leakage between clients.

The Argus cloud is built with business continuity in mind: the platform can bring up a mirror environment at short notice, and pre-scheduled weekly/monthly backups allow restoration to a point in time.

To ensure full compliance with local regulatory and data privacy laws, the cloud installation of this solution is hosted locally, within Australia. No data moves outside the geographic boundaries of the country.

Multi-factor authentication is part of our product roadmap.

We use the SFTP protocol, which encrypts data during transmission, and HTTPS to secure data during application access. Additionally, we store the data on a standard RDBMS, eliminating the need for any additional encryption.

The subscriber can download the data via the reports and screens the application provides within the notice period of termination. After discontinuation of the service, both the data and the application instance hosted for the bank will be destroyed.


Our solution has an in-built ‘Data Profiler’ tool which can be used to detect possible issues with data at the time of loading. The tool is equipped with mathematical, logical and relational operators with which users can configure and run checks on data as it is loaded. The checks include, but are not limited to, checks on variances with respect to data from a previous upload date.
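A variance check of the kind described above can be sketched as follows; the field names and the 10% limit are illustrative assumptions, not the product's actual configuration.

```python
# Sketch of a profiling check: flag fields whose value moved by more than
# a set percentage relative to the previous upload date. The 10% limit
# and field names are illustrative.
def variance_alerts(current: dict, previous: dict, max_pct=0.10):
    """Return (field, variance) pairs whose variance exceeds max_pct."""
    alerts = []
    for field, value in current.items():
        prior = previous.get(field)
        if prior in (None, 0):
            continue  # no baseline to compare against
        variance = abs(value - prior) / abs(prior)
        if variance > max_pct:
            alerts.append((field, round(variance, 4)))
    return alerts

previous = {"housing_loans": 1_000_000, "deposits": 5_000_000}
current = {"housing_loans": 1_250_000, "deposits": 5_100_000}
print(variance_alerts(current, previous))  # [('housing_loans', 0.25)]
```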

Our solution is equipped with an n-level drill-down framework which allows easy navigation from reported fields to their driving components. The solution offers the following features:

a.Direct one click drill-down from any cell on the report

b.Drill-down on a specific report line item to access all the underlying granular data

c.Drill-down at Rule Level, Account Level, Transaction Level and GL Level

d.Drill-down and download the underlying data tables, where applicable

e.Export downloaded underlying data tables to CSV files

Our solution allows users to adjust data before report submission. Users can directly adjust the data point being reported, or adjust attributes at the contract level. The solution also allows users to adjust GL balances to overcome data quality issues in the source systems. Users can capture comments against adjustments and route them through an approval workflow for review. The solution maintains a full audit trail of all adjustments and generates a summary report which can be used for review.
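The shape of such an adjustment record with an audit trail could be sketched as below; the structure and names are our own illustration, not the product's actual schema.

```python
# Sketch of a pre-submission adjustment retained in an audit trail.
# The record structure is illustrative, not the product's actual schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Adjustment:
    report: str
    line_item: str
    original: float
    adjusted: float
    comment: str
    made_by: str
    approved_by: Optional[str] = None   # set by the approval workflow
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

audit_trail: list = []

def adjust(report, line_item, original, adjusted, comment, made_by):
    adj = Adjustment(report, line_item, original, adjusted, comment, made_by)
    audit_trail.append(adj)   # every adjustment is retained for review
    return adj

adjust("ARF 720.1A", "Housing credit", 2_480_000, 2_500_000,
       "Source-system rounding issue", made_by="analyst1")
print(len(audit_trail))  # 1
```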

Our solution has a user-friendly interface with in-built mathematical, logical, and relational operators using which users can set up validation checks between report line items, both within and across reports. Specifically, users can perform the following functions using this utility:

a.Set up validation checks to check for mathematical and logical relationships between data points

b.Configure actions ranging from raising alerts to stopping submissions in case of errors

c.Track and stop submissions based on pre-defined Standard Deviation and Variance levels
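The functions above can be sketched as a configurable rule list, where each rule carries a predicate and an action ranging from an alert to blocking submission. The rule expressions and names here are illustrative assumptions.

```python
# Sketch of configurable validation checks between report line items.
# Rules, field names, and actions are illustrative.
ALERT, BLOCK = "alert", "block"

# Each rule: (description, predicate over the report data, action)
rules = [
    ("total assets = liabilities + equity",
     lambda d: d["assets"] == d["liabilities"] + d["equity"], BLOCK),
    ("housing credit below 80% of total loans",
     lambda d: d["housing_credit"] <= 0.8 * d["total_loans"], ALERT),
]

def run_checks(data: dict):
    """Evaluate all rules; return failures as (description, action) pairs."""
    return [(desc, action) for desc, ok, action in rules if not ok(data)]

data = {"assets": 100, "liabilities": 60, "equity": 40,
        "housing_credit": 90, "total_loans": 100}
print(run_checks(data))  # [('housing credit below 80% of total loans', 'alert')]
```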

Our solution has an in-built ‘Review Process Flow’ engine which provides a GUI with a drag-and-drop tool for creating any number of user-defined workflows. The workflows are completely parameterisable, with no limit on the number of levels in the approval process. Once created, workflows can be easily modified to include new users and user levels, and even to change the functions performed by any role. The Review Process Flow engine generates alerts that allow makers and checkers to track and monitor reports and complete the approval process within pre-defined timelines.

Our solution provides a user interface for managing user creation, role creation, role assignment, and privilege assignment for accessing various reports and artefacts.


All banks that subscribe to Argus’ cloud-based APRA reporting service will receive a walkthrough of Argus’ Data Reception structure. Banks then use Argus’ Data Integration layer to load mapped data extracts from their source systems into Argus’ Reception Area. Banks also receive a Functional Specifications Document detailing a comprehensive set of pre-built data profiles on internal reporting taxonomies (derived dimensions) covering all reporting fields. Banks then configure business conditions as applicable by leveraging the input reporting taxonomies and load the configured data into Argus’ Data Mart for consumption by pre-built reporting rules. The solution automatically generates all applicable templates for the reporting date.

A typical implementation timeframe for a reporting standard like APRA EFS is 9 to 12 months (for completion of all three phases), after the subscriber has augmented their source systems with missing data fields.

The following are the critical success factors to comply with the stated timelines:

a.Conducting a data gap analysis and augmenting the source systems with missing data fields

b.Being able to robustly map data extracts obtained from various source systems into reporting taxonomies

c.Starting early

Banks can validate system generated returns in the following ways:

a.Using configurable rules for reconciling both within and across reports

b.Using report drill-down feature to review details at the most granular level

c.Using system-generated worksheets which provide step-by-step computation details for exposures, collateral apportionment, and facility apportionment

Post implementation

Argus will ensure there are no errors and/or omissions resulting from incorrect business logic in the pre-defined internal reporting taxonomies or from incorrect pre-built rules used for generating the returns. However, the subscriber shall have checks and balances in place to avoid errors and/or omissions resulting from poor data quality, incorrect data mapping, incorrect configuration of business rules, etc.

Argus constantly monitors updates and changes released by regulators and updates its pre-defined internal reporting taxonomies and pre-wired report generation rules accordingly. All such regulation-driven product upgrades will be released to Argus’ clients independently of software upgrades/releases.

With the Argus cloud-based APRA reporting service, each customer is covered by our Annual Maintenance Agreement, with pre-defined SLAs covering fixes for defects in the core platform and solution features, and access to updates and upgrades as available. Support is provided during Australian business hours and per the Australian business calendar.


The Argus cloud-based APRA reporting service follows a straightforward annual pricing model, quoted on the basis of project scope, size and scale of operations, number of users and any other bank-specific determinants. The annual price is inclusive of the software licence and annual support.