

Analytical model development and QA

By

Stewart Williams

November 9, 2020

From risk management through to specification, here are the 10 most important things organisations should consider when putting an analytical model development and QA process in place.

While I was talking to one of my colleagues recently, she challenged me to put down in a mind map all the things I thought organisations needed to consider when performing analytical modelling QA.


I accepted the challenge, and my resulting mind map was detailed to say the least, covering many aspects of analytical model development, governance and use.


I’ve distilled these down into the ten most important components that organisations need to manage when considering how they can put a rigorous analytical model development and QA process in place.


Analytical model development, governance and use: 10 best practice components

1. Model Risk Management

The US Federal Reserve’s ‘SR 11-7: Guidance on Model Risk Management’ defines model risk as “the potential for adverse consequences from decisions based on incorrect or misused model outputs and reports”. It’s a good definition.


While sophisticated processes are already in place in the finance sector, model risk must be formally and proactively managed by any organisation that relies on analytical models for decision-making.


The level of Quality Assurance required for a model depends on both the level of model risk and the complexity of the model.


2. Governance

To effectively implement analytical model risk management, processes for model governance are essential.


Roles, responsibilities, and standards for model development and assurance processes should be well defined (at both the organisational and project levels).  Each model should have a designated ‘owner’ who takes responsibility for it throughout its lifecycle and has the task of confirming it is fit for purpose.


Organisations should understand the overall level of model risk they face.  A model risk register can be used to provide an overview of models in terms of their risk and their complexity, documenting assurance activities and helping auditors ensure processes are followed.
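As an illustration, a minimal risk register can be kept as structured data, with the assurance tier derived from risk and complexity. A hypothetical sketch (the field names, scoring scale and tier thresholds here are invented, not prescribed):

```python
from dataclasses import dataclass

@dataclass
class ModelRecord:
    name: str
    owner: str            # designated owner, responsible throughout the lifecycle
    risk: int             # 1 (low) to 3 (high) consequence of incorrect outputs
    complexity: int       # 1 (simple) to 3 (complex)
    last_reviewed: str    # date of the last assurance review

    def qa_level(self) -> str:
        """Derive the assurance tier from risk x complexity."""
        score = self.risk * self.complexity
        if score >= 6:
            return "independent review"
        if score >= 3:
            return "peer review"
        return "self-check"

register = [
    ModelRecord("pricing-model", "A. Analyst", risk=3, complexity=2,
                last_reviewed="2020-10-01"),
    ModelRecord("staff-rota", "B. Planner", risk=1, complexity=1,
                last_reviewed="2020-09-15"),
]

for m in register:
    print(f"{m.name}: QA level = {m.qa_level()}")
```

Even a register this simple gives auditors a single place to see each model's owner, risk profile and review history.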


3. Model Requirements

Model requirements represent the model from the perspective of the user, or the organisation as a whole.


Most models have some level of documentation for their functional requirements, whether these be data requirements, technical requirements or stipulations as to how a business process might be supported.


While often neglected, this stage must also encompass non-functional requirements, e.g. usability, interoperability (with other software, systems, models), security, understandability/transparency, maintainability.


The requirements stage must also stipulate the delivery timescales – often it is better to have a simple model delivered quickly than a complex model later.


4. Model Specification

The Model Specification represents the model from the perspective of the modeller/developer.


In this step it’s important to answer the fundamental questions about the model itself:


  • What assumptions can be made that are valid (given the role of the model)?

  • What calculations/algorithms are to be applied?

  • How is data to be structured/stored?

  • What should the user interface look like?

  • What outputs/reporting functionality is required?


These specification items need to be linked to model requirements, enabling traceability and facilitating validation and verification processes.
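One lightweight way to make that traceability concrete is to record which requirement each specification item satisfies, then flag any requirements left uncovered. The identifiers and items below are invented for illustration:

```python
# Requirements, as captured at the requirements stage (hypothetical examples)
requirements = {
    "R1": "Forecast monthly demand",
    "R2": "Export results to CSV",
    "R3": "Run in under 60 seconds",
}

# Specification items, each linked back to the requirement(s) it addresses
spec_items = [
    ("S1", "Exponential smoothing algorithm", ["R1"]),
    ("S2", "CSV report writer", ["R2"]),
]

covered = {req for _, _, reqs in spec_items for req in reqs}
uncovered = sorted(set(requirements) - covered)
print("Requirements without a specification item:", uncovered)
```

Running a check like this before development starts makes gaps in the specification visible while they are still cheap to fix.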


5. Design and Development

In this stage you need to define in detail exactly how the model will be put together. You need to:


  • Identify the development lifecycle best suited to the requirement (e.g. waterfall, phased, agile)

  • Decide if you are using a prototyping approach

  • Identify off-the-shelf packages that could be used to make development easier (and find out what verification and validation has been performed on the packages)

  • Agree the coding standards to be followed

  • Identify whether testing can be automated

  • Agree the design documentation process


To future-proof the QA process, design documentation should allow someone else (with the necessary skills) to take over development/maintenance of the model, and it should provide traceable routes back to the model specification.


6. Verification and Validation

Verification is the process of determining that a model implementation accurately represents the model specification.


Validation can be considered from two different perspectives:


  • A software validation perspective that assesses whether the model meets the documented requirements

  • A real-world validation perspective that determines the degree to which the model is an accurate reflection of the real world, taking account of the intended uses of the model


The verification and validation stage should address both functional and non-functional requirements.
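For code-based models, verification often takes the form of tests that check the implementation against worked examples from the specification, while real-world validation compares outputs with observed data. A minimal sketch, assuming an invented growth model and made-up figures:

```python
def compound_growth(principal: float, rate: float, years: int) -> float:
    """Model implementation under test (illustrative formula only)."""
    return principal * (1 + rate) ** years

# Verification: does the implementation reproduce the specification's
# worked example (100 at 5% for 2 years should give 110.25)?
assert abs(compound_growth(100.0, 0.05, 2) - 110.25) < 1e-9

# Real-world validation: is the output within an agreed tolerance of
# an observed value (hypothetical observation and 1% tolerance)?
observed = 110.9
assert abs(compound_growth(100.0, 0.05, 2) - observed) / observed < 0.01

print("verification and validation checks passed")
```

Keeping both kinds of check in an automated test suite means they can be re-run whenever the model changes.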


7. Model Deployment

How a model is to be eventually deployed needs to be considered very early on in the requirements stage. If a model is for internal use within an organisation, this may be fairly straightforward.


Consideration should be given to the IT permissions necessary to install the model in the first place, what tools need to be on the user’s PC, etc.


When a model is to be released to the public (as some of our clients require) then there can be a range of implementation environments.  This extends the need for deployment testing, which increases costs and turnaround time. This is something that needs to be built into the plan from the outset.


8. Model Use

In many organisations the model users are not necessarily the model developers.  It is important that users, both at present and in the future, can use a model correctly and understand its features and limitations.


At a minimum, some sort of user documentation (or help file) is a necessity, but more substantive user training may be required.  This could cover both use of the software and effective use of the model:


  • Model assumptions

    • In what circumstances is it invalid to use the model?

    • Sensitivity and uncertainty analyses
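As an illustration of the last point, a simple one-at-a-time sensitivity analysis varies each input in turn and reports the effect on the output. The toy model and perturbation size here are invented:

```python
def demand_model(price: float, marketing: float) -> float:
    """Illustrative toy model: demand falls with price, rises with marketing."""
    return 1000 - 40 * price + 2.5 * marketing

baseline = {"price": 10.0, "marketing": 100.0}

for param in baseline:
    for delta in (-0.1, 0.1):                     # +/- 10% perturbation
        inputs = dict(baseline)
        inputs[param] *= 1 + delta
        change = demand_model(**inputs) - demand_model(**baseline)
        print(f"{param} {delta:+.0%}: output changes by {change:+.1f}")
```

Even this crude check shows users which inputs the model is most sensitive to, and therefore where input errors matter most.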


You’ll also need to consider what user support may be necessary over the entire model lifecycle, not just in the immediate term.


9. Control and Monitoring

Control and Monitoring is required both during development and use.  Examples include:


  • Processes and tools to support version and change control

  • Access control levels that allow distinctions to be made between users of the model and those who can maintain/develop it

  • A review cycle that assesses:

    • Whether the model is still being used within the limits for which it was designed (i.e. are the assumptions valid given the context of its current use)

    • Whether any model data is still valid
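Parts of that review cycle can be automated. A hypothetical guard that flags when current inputs drift outside the ranges the model was designed for (the parameter names and ranges are invented):

```python
# Design-time validity ranges, recorded when the model was built
VALID_RANGES = {
    "interest_rate": (0.0, 0.10),
    "horizon_years": (1, 30),
}

def check_inputs(inputs: dict) -> list:
    """Return a warning for any input outside its designed range."""
    warnings = []
    for name, value in inputs.items():
        lo, hi = VALID_RANGES[name]
        if not lo <= value <= hi:
            warnings.append(f"{name}={value} outside designed range [{lo}, {hi}]")
    return warnings

print(check_inputs({"interest_rate": 0.15, "horizon_years": 10}))
```

Running such a check on every model run turns "are the assumptions still valid?" from a periodic manual question into a continuous automated one.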



10. The QA Process

Quality Assurance is fundamental to managing analytical model risk. It validates the model development and management processes, provides assurance that the types of activities described in this article are actually being performed in practice, and results in recommendations as to what improvements can be made.


Though these components are described above as if occurring in sequence, this is rarely the case.  One challenge in putting together an iterative approach to model development, for example, is how these components are distributed and addressed across the iterations.  More on that in another post.
