COVID-19 creates a need for analytics in real time
Regarding the COVID-19 pandemic, Warren Buffett has observed that “we haven’t faced anything that quite resembles this problem” and the fallout is “still hard to evaluate.”
The pandemic has created an unprecedented shock to economies and asset performance. The recent unemployment data, although encouraging, has only added to the uncertainty. Furthermore, impact and recovery are uneven, often varying considerably from county to county and city to city. Consider:
- COVID-19 cases and fatalities were initially concentrated in just a few cities and counties, resulting in an almost total shutdown of these regions.
- Certain sectors, such as travel and leisure, have been hit harder than others, while sectors such as oil and gas face additional, industry-specific pressures. Regions with exposure to these sectors have higher unemployment rates even with fewer COVID-19 cases.
- Timing of reopening and recoveries has also varied due to regional and political factors.
Regional employment, business activity, consumer spending and several other macro factors are changing in real time, and this information is available through several non-traditional data sources.
Legacy models are not working, and several known correlations have broken down.
Determining value and risk in this environment requires unprecedented quantities of analytics and on-demand computational bandwidth.
Need for on-demand computation and storage across the organization
“I don’t need a hard disk in my computer if I can get to the server faster… carrying around these non-connected computers is byzantine by comparison.” ~ Steve Jobs
Front office, risk management, quants and model risk management – every aspect of the analytics ecosystem requires the ability to run large numbers of scenarios quickly.
Portfolio managers need to recalibrate asset valuation, manage hedges and answer questions from senior management, all while looking for opportunities to find cheap assets. Risk managers are working closely with quants and portfolio managers to better understand the impact of this unprecedented environment on assets. Quants must not only support existing risk and valuation processes but also be able to run new estimations and explain model behavior as data streams in from a variety of sources.
These activities require many processors and large storage volumes to be stood up on demand. Even in normal times, infrastructure teams require at least 10 to 12 weeks to procure and deploy additional hardware. With most of the financial services world now working remotely, this time lag is further exacerbated.
No individual firm maintains enough excess capacity to accommodate such a large and urgent need for data and computation.
The work-from-home model has proven that we have sufficient internet bandwidth to enable the fast access required to host and use data on the cloud.
Cloud is about how you do computing
“Cloud is about how you do computing, not where you do computing.” ~ Paul Maritz, CEO of VMware
Cloud computing is now part of everyday vocabulary and powers even the most common consumer devices. However, financial services firms are still in early stages of evaluating and transitioning to a cloud-based computing environment.
Cloud is the only way to procure the level of surge capacity required today. At RiskSpan we are computing an average of a half-million additional scenarios per client on demand. Users don’t have the luxury of waiting for an overnight batch process to react to changing market conditions: end users fire off a new scenario assuming that the hardware will scale up automagically.
When searching Google’s large dataset or using Salesforce to run analytics, we expect the hardware scaling to be limitless. Unfortunately, valuation and risk management software is typically built to run on a pre-defined hardware configuration.
Cloud native applications, in contrast, are designed and built to leverage the on-demand scaling of a cloud platform. Valuation and risk management products offered as SaaS scale on-demand, managing the integration with cloud platforms.
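The scaling model can be sketched in miniature. A cloud-native design treats each scenario as an independent unit of work that can fan out across however many workers the platform provides. The pricing function below is purely illustrative (a toy duration approximation, not any vendor’s model), and locally the “fleet” is just a thread pool:

```python
from concurrent.futures import ThreadPoolExecutor

def run_scenario(shock_bps: int) -> float:
    """Revalue a toy position under a parallel rate shock (in basis points).
    A real engine would price a full portfolio; this duration
    approximation keeps the sketch self-contained."""
    price, duration = 100.0, 5.0
    return price * (1 - duration * shock_bps / 10_000)

shocks = range(-300, 301, 25)  # -300bp to +300bp in 25bp steps
# On a cloud-native platform, the pool below would be an on-demand fleet
# of servers sized to the request; here it is a local thread pool.
with ThreadPoolExecutor(max_workers=8) as pool:
    values = list(pool.map(run_scenario, shocks))
print(f"{len(values)} scenarios priced")  # prints: 25 scenarios priced
```

Because each scenario is stateless and independent, the same code scales from eight local threads to a thousand cloud workers without restructuring, which is exactly the property a pre-defined hardware configuration lacks.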
Financial services firms don’t need to take on the burden of rewriting their software to work on the cloud. Platforms such as RS Edge enable clients to plug their existing data, assumptions and models into a cloud-native platform. This enables them to get all the analytics they’ve always had—just faster and cheaper.
Serverless access can also help companies provide their quant groups with on-demand compute without incurring additional IT resource expense.
A recent survey from Flexera shows that 30% of enterprises have increased their cloud usage significantly due to COVID-19.
Cloud is cost effective
“In 2000, when my partner Ben Horowitz was CEO of the first cloud computing company, Loudcloud, the cost of a customer running a basic Internet application was approximately $150,000 a month.” ~ Marc Andreessen, Co-founder of Netscape, Board Member of Facebook
Cloud hardware is cost effective, primarily due to the on-demand nature of the pricing model. A $250B asset manager uses RS Edge to run millions of scenarios in a 45-minute window every day. The analysis runs across a thousand servers at a cost of $500 per month. The same hardware, if deployed around the clock, would cost $27,000 per month.
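The savings come directly from usage-based billing: you pay for server-hours consumed, not for servers provisioned. The back-of-the-envelope calculation below uses a purely hypothetical per-server hourly rate (not RiskSpan’s or any cloud provider’s actual pricing) to show why a fleet that runs 45 minutes a day costs a small fraction of one that runs around the clock:

```python
def monthly_cost(servers: int, hours_per_day: float,
                 rate_per_server_hour: float, days: int = 30) -> float:
    """Usage-based billing: pay only for server-hours actually consumed."""
    return servers * hours_per_day * days * rate_per_server_hour

RATE = 0.02  # hypothetical $/server-hour, not a real quote
burst = monthly_cost(1_000, 0.75, RATE)      # 45 minutes per day
always_on = monthly_cost(1_000, 24, RATE)    # running around the clock
print(f"burst: ${burst:,.0f}/mo, always-on: ${always_on:,.0f}/mo "
      f"({always_on / burst:.0f}x)")
# prints: burst: $450/mo, always-on: $14,400/mo (32x)
```

Whatever the actual rate, the ratio is fixed by usage: 24 hours versus 45 minutes is a 32x difference in server-hours, and under on-demand pricing that translates directly into a 32x difference in cost.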
Cloud is not free and can be a double-edged sword. The same on-demand capability that enables end users to spin up servers as needed can, if not monitored, cause costs to accumulate to undesirable levels. One of the benefits of a cloud-native platform is built-in procedures to drop unused servers, which minimizes the risk of paying for unused bandwidth.
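The “drop unused servers” safeguard amounts to a simple idle-timeout policy. The sketch below is a generic illustration (the server names and the 30-minute threshold are hypothetical, not any platform’s actual settings) of the housekeeping logic involved:

```python
from datetime import datetime, timedelta

IDLE_LIMIT = timedelta(minutes=30)  # hypothetical policy threshold

def servers_to_terminate(last_activity: dict, now: datetime) -> list:
    """Return IDs of servers whose last recorded activity is older than
    the idle limit -- the housekeeping a cloud-native platform runs so
    forgotten servers stop accruing charges."""
    return [server_id for server_id, seen in last_activity.items()
            if now - seen > IDLE_LIMIT]

now = datetime(2020, 6, 1, 12, 0)
fleet = {
    "worker-1": now - timedelta(minutes=5),  # active recently -> keep
    "worker-2": now - timedelta(hours=2),    # long idle -> terminate
}
print(servers_to_terminate(fleet, now))  # prints: ['worker-2']
```

Running a check like this on a schedule turns cost control from a manual chore into an automatic property of the platform.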
And yes, Mr. Andreessen’s basic application can be hosted today for less than $100 per month.
The same survey from Flexera shows that organizations plan to increase public cloud spending by 47% over the next 12 months.
Alternate data analysis
“The temptation to form premature theories upon insufficient data is the bane of our profession.” ~ Sir Arthur Conan Doyle (as Sherlock Holmes)
Alternate data sources are not always easily accessible or available within analytic applications. The effort and time required to integrate them can be wasted if their usefulness cannot be determined upfront. The timing of analyzing and applying the data is key.
Machine learning techniques offer quick and robust ways of analyzing this data, but the tools to run these algorithms are not readily available on a desktop computer.
Every major cloud platform provides a wealth of tools, algorithms and pre-trained models to integrate and analyze large and messy alternate datasets.
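As a toy illustration of the kind of relationship such tools estimate, consider fitting a one-variable model relating a hypothetical alternate-data signal (a weekly foot-traffic index) to observed card spending. The data below is made up, and cloud ML services wrap far richer algorithms than this closed-form least-squares fit; the sketch only shows the basic idea:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (closed form).
    Returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    return my - b * mx, b

# Hypothetical weekly observations -- illustrative values only
foot_traffic = [120, 95, 60, 30, 45, 80]
spending     = [240, 190, 125, 70, 95, 165]
a, b = fit_line(foot_traffic, spending)
print(f"spending ~= {a:.1f} + {b:.2f} * foot_traffic")
```

The point of the cloud toolkits is that analyses like this scale from a six-point toy to millions of messy records, with pre-trained models and managed infrastructure handling the heavy lifting.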