One of the top five banks in the US, a 150-year-old organization with over $1.7 trillion in assets, was using a legacy tool for its load testing requirements. Among the applications covered by this tool was the bank’s Fraud Management System, which is responsible for identifying fraudulent activity among millions of daily transactions, occurring at a rate of thousands per second.
It was of utmost importance that the lab environment handle real-world load scenarios in order to replicate issues occurring in the production environment. The legacy tool was unable to generate quality traffic via synthetic load generation, which led to an inability to reproduce or highlight issues with the Fraud Management System, as the application’s production-like code paths were not being exercised.
PERFORMANCE TESTING PLATFORM TO IDENTIFY ISSUES BY REPLICATING REAL-WORLD SCENARIOS
Serving over 70 million customers globally, the bank achieves unparalleled insights across its lab and production environments by using NetStorm to replicate real-world traffic and identify anomalies in critical applications.
– Vice President, IT
The legacy load testing solution was unable to replicate real-world load scenarios in the lab environment, leaving production issues unresolved.
The bank resolved application issues by replicating production traffic in the test setup, and the faithful test environment further helped reduce Mean Time To Repair (MTTR).
The bank’s QA team created synthetic load testing scripts via their usual load testing solution (as is typical with most legacy load testing tools) to simulate load on the Fraud Management application. The application scores the risk profile of each transaction based on complex logic using (but not limited to) the transaction type, frequency, current location, history, and the profile of the customer initiating the transaction. The legacy tool could not produce a quality load test, as the application’s behaviour under the synthetic load was noted to be quite different from the system’s behaviour in production.
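For illustration only, a rule-based risk scorer of the kind described above might combine weighted signals such as transaction type, frequency, and location. The names, weights, and thresholds below are entirely hypothetical and are not the bank’s actual (far more complex, proprietary) scoring logic:

```python
# Illustrative only: a toy risk scorer combining hypothetical signals.
# The bank's real Fraud Management logic is proprietary and more complex.
from dataclasses import dataclass


@dataclass
class Transaction:
    amount: float
    tx_type: str          # e.g. "wire", "card", "ach"
    country: str          # country code of the transaction's origin
    txns_last_hour: int   # recent transaction frequency for this customer
    home_country: str     # customer's usual location


def risk_score(tx: Transaction) -> float:
    """Return a score in [0, 1]; higher means riskier (hypothetical weights)."""
    score = 0.0
    if tx.tx_type == "wire":
        score += 0.3                  # wire transfers carry more risk
    if tx.country != tx.home_country:
        score += 0.3                  # geolocation mismatch
    if tx.txns_last_hour > 10:
        score += 0.2                  # unusual burst of activity
    if tx.amount > 10_000:
        score += 0.2                  # large transfer
    return min(score, 1.0)


tx = Transaction(amount=15_000, tx_type="wire", country="RO",
                 txns_last_hour=12, home_country="US")
print(risk_score(tx))  # 1.0 — all four hypothetical signals fire
```

The point of the sketch is why synthetic scripts fell short: unless the generated traffic varies realistically across all of these dimensions, many scoring branches are simply never exercised.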
A critical feature of NetStorm, Production Access Log Replay, was used to import production logs and simulate the exact traffic witnessed in the bank’s production environment. A database copy was taken from production and access logs were replayed to simulate real-world scenarios, which led to a high-fidelity replication that matched the behaviour of the production system. This was a key objective for the bank, which sought to resolve the issues faced in production.
NetStorm generated thousands of transactions per second in the lab environment, matching production in both exact traffic pattern and transaction details. Once the lab environment started ingesting this real-world load, the bank could observe issues that had previously not surfaced due to the legacy tool’s inability to mimic real-world traffic.
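Conceptually, access-log replay means parsing timestamps and request details out of production logs and re-issuing the requests with their original spacing. The minimal sketch below illustrates the general idea; it is not NetStorm’s implementation, and the log format and function names are assumptions:

```python
# Illustrative sketch of access-log replay: parse timestamps and request
# paths from a combined-format access log, then re-issue the requests
# while preserving the original inter-request gaps.
# This is NOT NetStorm's implementation; all names here are hypothetical.
import re
import time
from datetime import datetime
from typing import Iterator, Tuple

LOG_RE = re.compile(r'\[(?P<ts>[^\]]+)\] "(?P<method>\w+) (?P<path>\S+)')


def parse_log(lines) -> Iterator[Tuple[datetime, str, str]]:
    """Yield (timestamp, method, path) for each parseable log line."""
    for line in lines:
        m = LOG_RE.search(line)
        if m:
            ts = datetime.strptime(m.group("ts"), "%d/%b/%Y:%H:%M:%S %z")
            yield ts, m.group("method"), m.group("path")


def replay(lines, send) -> int:
    """Replay log entries via send(), sleeping to preserve original timing."""
    prev = None
    count = 0
    for ts, method, path in parse_log(lines):
        if prev is not None:
            gap = (ts - prev).total_seconds()
            if gap > 0:
                time.sleep(gap)  # reproduce the production traffic rhythm
        send(method, path)       # e.g. an HTTP client call to the lab system
        prev = ts
        count += 1
    return count


sample = [
    '10.0.0.1 - - [01/Mar/2024:10:00:00 +0000] "POST /fraud/score HTTP/1.1" 200',
    '10.0.0.2 - - [01/Mar/2024:10:00:00 +0000] "POST /fraud/score HTTP/1.1" 200',
]
sent = []
replay(sample, lambda m, p: sent.append((m, p)))
print(len(sent))  # 2
```

Replaying recorded traffic rather than scripted traffic is what preserves the production mix of transaction types, burst patterns, and payloads that synthetic scripts tend to flatten out.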
Additionally, using NetStorm’s easy and intuitive scenario builder, the bank was able to tweak the load generation scenarios and soon received alerts resembling those in the production environment, both in frequency and quality.
NetStorm ensured that one of the bank’s most critical applications could handle high load volumes without issues, where the legacy tool had failed to achieve this objective.
NetStorm’s deployment resulted in several major optimizations:
- MTTR and MTTD (Mean Time to Repair and Mean Time to Detect), two all-important metrics for handling issues, were reduced thanks to early issue identification in the testing environment.
- By going from twelve performance test servers with the legacy testing solution to a single server with NetStorm, the bank witnessed a massive reduction in the tool’s infrastructure cost and a high degree of test setup reliability.
- Post load generation, NetStorm’s built-in diagnostic capabilities were instrumental in identifying multiple issues in the bank’s application.
- NetStorm’s unique built-in, geolocation-based IP database helped replicate the geolocation of the user initiating each transaction, which was needed to faithfully simulate production transactions.
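The general idea behind a geolocation IP database is a lookup over sorted IP ranges. The toy sketch below shows that idea with made-up ranges; it is not NetStorm’s database or format:

```python
# Illustrative only: a tiny IP-to-geolocation lookup over sorted ranges,
# the general technique behind geolocation IP databases.
# The ranges and country assignments below are made up for demonstration.
import bisect
import ipaddress
from typing import Optional

# (range_start_as_int, range_end_as_int, country) — hypothetical sample data.
RANGES = [
    (int(ipaddress.ip_address("10.0.0.0")),
     int(ipaddress.ip_address("10.255.255.255")), "US"),
    (int(ipaddress.ip_address("192.168.0.0")),
     int(ipaddress.ip_address("192.168.255.255")), "GB"),
]
STARTS = [r[0] for r in RANGES]  # pre-extracted keys for binary search


def lookup(ip: str) -> Optional[str]:
    """Return the country for an IP, or None if no range contains it."""
    n = int(ipaddress.ip_address(ip))
    i = bisect.bisect_right(STARTS, n) - 1  # last range starting at or before n
    if i >= 0 and RANGES[i][0] <= n <= RANGES[i][1]:
        return RANGES[i][2]
    return None


print(lookup("10.1.2.3"))    # US
print(lookup("172.16.0.1"))  # None
```

Tagging each replayed transaction with a source geolocation in this way lets location-sensitive fraud rules (such as a mismatch between transaction origin and customer home country) fire in the lab just as they would in production.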