"That's the vision," said James D'Arezzo, CEO of Condusiv Technologies. "The reality is more complicated. Unless banks deal with the performance issues that AI will cause for ultra-large databases (typically run on Microsoft SQL Server), they will not be able to take the money gained by eliminating positions and spend it on the new services and products they will need to stay competitive."
Looming technical issues notwithstanding, senior banking executives increasingly agree on the inevitability of AI-based services and an accompanying major reduction in the industry workforce. Former Deutsche Bank CEO John Cryan predicted a "bonfire" of industry jobs as automation takes hold across the sector, noting that increased processing power, cloud storage and other developments make possible tasks previously seen as too complex for automation. More recently, Vikram Pandit, who ran Citigroup from 2007 to 2012 during the financial crisis, said that developments in artificial intelligence and robotics could reduce the need for banking staff in support roles by as much as 30% over the next five years.
Industry observers note that there is significant demand for new automated capabilities. Among the potential use cases for AI in financial services are data-driven management decisions at lower cost; automated customer support; fraud detection and claims management; insurance management; automated virtual financial assistants; predictive analytics; and wealth management advisory services offered to lower-net-worth market segments.
D'Arezzo, whose company is the world leader in I/O reduction software, noted that providing the IT capability to support this wave of AI will be no trivial matter. Financial industry CIOs are already under pressure to deliver timely responses to increasingly complex requests for data analysis, and AI applications - which are far more demanding in terms of both data throughput and data quality - will vastly increase that pressure.
Intensive hardware upgrades, often cited as a solution to this problem, may simply make matters worse. D'Arezzo cited as an example the recent announcement that the Tokyo Institute of Technology's Global Scientific Information and Computing Center has begun development of a new supercomputer, Tsubame 3.0, to meet the demands of artificial intelligence and Big Data applications. The Japanese Ministry of Economy, Trade and Industry has committed $1.9 billion to the project.
"This machine is not on the market yet," said D'Arezzo, "and we have no idea what it will cost when it becomes available. Existing supercomputers, however, tend to weigh in at $50 million to several hundred million dollars, plus implementation and data-transition costs. For most financial institutions, this approach to the AI challenge - throwing hardware at it - not only negates the cost-reduction advantages AI has to offer, but is unnecessary. A much better way for the industry to manage AI is to increase the performance of its existing infrastructure, particularly SQL databases, which could produce the same efficiencies at a small fraction of the cost of new hardware."