Data Envelopment Analysis in Retail Banking
Since 1983, the Australian financial services sector has undergone extensive deregulation. This has increasingly forced banks to re-examine and re-design their existing structures in order to cope with greater competition and to reduce costs.
This study was undertaken by Dr Necmi K Avkiran, a Senior Lecturer in Financial Studies at the University of Queensland, Australia. Necmi is a member of the Australian Society for Operations Research, the Academy of Management and other professional associations such as the Australasian Institute of Banking and Finance. He has published articles in the Journal of Banking and Finance, Journal of Economics and Finance, International Journal of Human Resource Management, and many others. As a consultant Necmi’s clients include Sinclair, Knight and Merz, and John P. Young & Associates. Data envelopment analysis (DEA) was chosen for this study because, ultimately, it would help with the bank restructuring process and promote DEA as a versatile managerial decision-making tool within the client organisation.
The study aimed to assess the productivity of retail activities in 60 branches of an Australian trading bank (the name is withheld for reasons of confidentiality). The services provided by these branches include telling, customer services, housing loans, personal loans, and small business lending.
DEA was chosen as the analysis technique for a number of reasons, including the fact that:
. There is no restriction on the types of variables which can be included in the analysis.
. When using DEA the variables can be measured in different units.
. DEA measures technical efficiency, defined as the successful implementation of a production plan; any deviation from the plan is therefore readily detected.
During the model design phase, the first step was to identify the key performance outcomes (reflecting the corporate objectives) and then to select the factors (variables) that lead to these outcomes. A correlation analysis was conducted as part of the design process to remove highly correlated variables from the data set. The final variables used and their classification were as follows:
. Number of teller windows (controllable input)
. Tangible convenience (controllable input)
. Customer service quality instrument (controllable input)
. Managerial competence of branch manager (controllable input)
. Average annual family income (non-controllable input)
. Proportion of private dwellings rented (non-controllable input)
. Number of small business establishments (non-controllable input)
. Total number of new deposit accounts (output)
. Total number of new lending accounts (output)
. Fee income (output)
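The correlation screening mentioned above can be sketched as follows. The 0.9 cut-off and the variable names are illustrative assumptions, not values taken from the study.

```python
import numpy as np

def drop_highly_correlated(data, names, cutoff=0.9):
    """Drop one variable from each pair whose absolute correlation exceeds cutoff.

    data: 2-D array with one column per candidate variable (rows are branches).
    names: column labels. The 0.9 cutoff is an illustrative choice.
    """
    corr = np.corrcoef(data, rowvar=False)
    keep = list(range(len(names)))
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            if i in keep and j in keep and abs(corr[i, j]) > cutoff:
                keep.remove(j)  # keep the first of the pair, drop the second
    return [names[k] for k in keep]
```

Which member of a correlated pair to retain is a judgement call in practice; the sketch simply keeps the first one encountered.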
The variables tangible convenience, customer service quality instrument and managerial competence of branch manager were introduced as composite variables. For example, tangible convenience consisted of five items: (1) location at regional shopping centre, (2) location adjacent to or within walking distance of a regional shopping centre, (3) number of automatic teller machines, (4) presence of a free car park and (5) proximity to public transportation. Branch customers rated customer service quality on a questionnaire comprising 17 items. Managerial competence was rated by the immediate subordinates of branch managers based on a questionnaire comprising 45 items.
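One simple way to turn heterogeneous items like these into a single composite variable is to min-max normalise each item across branches and then average; the equal-weight scheme below is an illustrative assumption, as the text does not specify how the items were combined.

```python
def composite(items_by_branch):
    """Equal-weight composite from min-max-normalised item values.

    items_by_branch: one tuple of item values per branch (e.g. the five
    tangible-convenience items). Normalising each item to [0, 1] before
    averaging keeps items measured on different scales comparable.
    """
    cols = list(zip(*items_by_branch))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]
    scores = []
    for row in items_by_branch:
        norm = [(v - l) / (h - l) if h > l else 0.0
                for v, l, h in zip(row, lo, hi)]
        scores.append(sum(norm) / len(norm))
    return scores
```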
The next step was to decide whether the analysis would focus on input minimisation or output maximisation. As the focus was on cost reduction, input minimisation was deemed the more appropriate choice; output maximisation would be appropriate if expanding the market were considered important. As part of the analysis, three productivity models were run: the first used all variables under input minimisation (model 1); the second used only the controllable inputs with the outputs, again under input minimisation (model 2); and the third used the non-controllable inputs with the outputs under output maximisation (model 3). To determine whether the models should assume constant or variable returns to scale, the relationship between scale of operations and efficiency scores was investigated. Of the variables used, the number of teller windows was the best proxy for branch size. Its correlation with the efficiency score was found to be -0.05; this near-zero correlation suggested that efficiency did not vary systematically with branch size, so all three models were run under constant returns to scale.
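An input-oriented, constant-returns-to-scale DEA score of the kind described above can be computed by solving one small linear programme per branch. The sketch below uses SciPy's `linprog`; the toy data in the usage note are illustrative, not the bank's.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(X, Y, o):
    """Input-oriented CCR (constant returns to scale) efficiency of unit o.

    X: inputs, shape (m, n) for m inputs and n units.
    Y: outputs, shape (s, n) for s outputs and n units.
    Solves: min theta  s.t.  X @ lam <= theta * X[:, o],
                             Y @ lam >= Y[:, o],  lam >= 0.
    Decision vector: [theta, lam_1, ..., lam_n].
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(1 + n)
    c[0] = 1.0                                    # minimise theta
    # Input rows:  -theta * x_io + sum_j lam_j * x_ij <= 0
    A_in = np.hstack([-X[:, [o]], X])
    b_in = np.zeros(m)
    # Output rows: -sum_j lam_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]), method="highs")
    return res.fun                                # optimal theta in (0, 1]
```

With a single input and output under constant returns to scale, the score reduces to a branch's output/input ratio divided by the best ratio in the sample, which makes the sketch easy to check by hand.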
Despite the advantages of DEA, care is still needed when interpreting the results of the analysis. A branch reported as 100% efficient is not necessarily producing the maximum outputs possible for the inputs used; rather, it is 100% efficient relative to its peers. The units (branches) in the analysis should also be homogeneous so that they are directly comparable with each other. The efficiency scores derived from the first two models were almost identical, with the exception of four units that appeared 100% efficient in model 1 but less efficient in model 2. From this it was apparent that the results of model 1 could not be used in the same decision-making context as model 2. Comparing model 1 with model 3 revealed greater variation between the scores. This showed that the scores were sensitive to the specification of the productivity model and highlighted the need to select a model that ties in with corporate objectives.
By looking at reference set frequency information in Frontier Analyst it was possible to identify a global leader that other branches could emulate. When doing this it is important to look for an efficient unit which has the most similar input/output characteristics to the inefficient unit rather than just taking the most frequently occurring peer as the unit to emulate.
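Reference-set frequency of the sort reported by Frontier Analyst can be derived from the peers DEA assigns to each inefficient unit (those with non-zero lambda weights). The branch labels below are hypothetical.

```python
from collections import Counter

def peer_frequencies(reference_sets):
    """Count how often each efficient branch appears as a peer.

    reference_sets: mapping from an inefficient branch to the list of
    efficient peers DEA assigned to it.
    """
    return Counter(peer for peers in reference_sets.values() for peer in peers)
```

The most frequently occurring peer is a natural candidate for the global leader, though, as noted above, an inefficient branch should emulate the efficient peer whose input/output mix most resembles its own.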
One of the results of the analysis was a suggested reduction in customer service quality. Clearly, caution and common sense are needed in interpreting and applying such a result, since lowering customer service quality is likely to provoke a backlash from customers. It may instead indicate that the role of customer service quality in generating the outputs is overrated in this case.
The analysis allowed more informed decisions to be made with respect to the branch network. Any restructuring calls for the identification of poor performers (which might be closed down or re-engineered) as well as star performers. The DEA approach can also help in establishing the structure of new branches by providing insight into the configuration of successful units and assisting with the effective allocation of resources. The potential improvements indicated by DEA should be examined closely to assess whether key variables or environmental factors have been excluded from the optimisation process; weight restrictions can be introduced to address such omissions. Perhaps the most valuable lesson to be learned is that DEA is part of a continual process: although it can identify performance targets, it does not tell us how to achieve them. This is where your knowledge as a manager comes in.
Source: The above case study summarises Chapter 4 in Avkiran, N.K. (2000) Productivity Analysis in the Services Sector with Data Envelopment Analysis, first edition revised, pp.45-63, Queensland, Australia: N K Avkiran. Printed with permission of the author.
With thanks to Necmi Avkiran for his time and co-operation in the preparation of this case study.