One of the research papers published by the HKMA said:
Banks in Hong Kong maintain capital adequacy ratios well above the regulatory requirement. … To the extent that part of the high capital buffer is due to the agency problem, information asymmetries, or a mismatch between the expectation of the regulator and banks over the approach to maintaining a capital buffer to prevent a breach of capital requirements, action could be taken to improve the use of capital. In this connection, the initiative under Basel II is expected to help address some of these issues.
Assuming that the risks borne by banks can be measured properly under a Basel II implementation, banks can re-invest the excess capital into the business. Of course, that is an absolutely ideal case. So, how well does a credit risk system quantify the risk, and does it represent the real picture? From experience on existing projects, there are two big issues upfront: (a) data quality and (b) the underlying risk model.
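To make the idea of "excess capital" concrete, here is a minimal sketch. The 8% minimum total capital ratio is the Basel standard; the bank figures below are made up purely for illustration, not taken from the paper.

```python
# Illustrative sketch: how much capital a bank holds above the Basel minimum.
# The 8% minimum total-capital-to-RWA ratio is the Basel standard; the
# numbers for the example bank below are invented for illustration only.

BASEL_MIN_RATIO = 0.08  # minimum total capital / risk-weighted assets

def excess_capital(capital: float, risk_weighted_assets: float) -> float:
    """Capital held above the regulatory minimum (same currency units)."""
    return capital - BASEL_MIN_RATIO * risk_weighted_assets

# A hypothetical bank running a 16% ratio -- double the minimum:
capital = 16.0   # e.g. HKD billions
rwa = 100.0
print(excess_capital(capital, rwa))  # 8.0 -- capital that could be redeployed
```

The point of better risk measurement is precisely to shrink the uncertainty that forces banks to hold such a large cushion in the first place.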
Take collateral information as an example. For historical reasons, this information is not recorded in detail in most banking systems. Data such as repossession, valuation, and default records may be maintained manually in spreadsheets. It is difficult to conduct precise analysis without a lot of data-cleansing work. Even worse, management may not see the need for an automated collateral system. So even if you have paid a lot for data cleansing, if the result is not fed back into the daily business, the information will soon be corrupted again.
I am not talking about how to extract the information correctly one time, but how to obtain the correct information in a seamless, ongoing way. Many consultants and vendors come up with products for Business Intelligence / Data Warehousing. But there should also be a system that allows users to maintain the collateral information correctly in the first place.
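One way to keep the data clean at the source is to validate each record at entry time rather than cleansing in bulk later. A minimal sketch follows; the field names and rules are hypothetical, not from any particular banking system.

```python
# A minimal sketch of entry-time validation for collateral records, so
# data stays clean instead of needing periodic bulk cleansing.
# Field names and rules here are hypothetical examples.

def validate_collateral(record: dict) -> list[str]:
    """Return a list of data-quality problems found in one record."""
    problems = []
    if not record.get("collateral_id"):
        problems.append("missing collateral_id")
    valuation = record.get("valuation")
    if valuation is None or valuation <= 0:
        problems.append("valuation must be a positive amount")
    if record.get("valuation_date") is None:
        problems.append("missing valuation_date")
    return problems

# A record typed in from a legacy spreadsheet:
row = {"collateral_id": "C-1001", "valuation": -5.0}
print(validate_collateral(row))
# -> ['valuation must be a positive amount', 'missing valuation_date']
```

Rejecting a bad record at the point of entry is cheap; reconstructing a repossession or valuation history from spreadsheets years later is what drives the cleansing cost.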
Underlying Risk Model
(to be continued…)