The Client

Leading global investment bank: Corporate and Investment Bank (CIB) division

About the Engagement
  • Industry: Financial services / Financial market data
  • Functional area: Global financial markets
  • Technologies:
    • Business logic: Java, Spring, Apache Storm, Hadoop, Activiti, SAP Hana
    • GUI: angular.js, GWT, REST, Tomcat
    • Integration: Solace/JMS, Oracle Coherence, Apache Camel, SOAP
    • Storage: WebDAV, HDFS
    • Database development: MarkLogic, Oracle
    • Platforms: SLES, Solaris
  • User profile: Specialist B2B financial market systems integration
  • System geography: Worldwide (EU, US, APAC)
  • Number of users: 200+ users in 10+ locations
  • Relationship status: Ongoing; joint ESP and client-side team of 18+ technical specialists, 3+ project architects and 2+ PMs
  • Duration: 18+ months

Business Challenge

Our client operates in the investor services field of corporate and investment banking and works with many types of data. Within this investor services landscape, each system uses its own local data dictionary and translation rules. As a result, data quality was inconsistent and access to real-time data was inadequate for end-of-day / batch-driven processing. These limitations made it impossible to gain a global view of investor services intraday credit risk exposure for clearing and custody services.

The client therefore wanted to develop a data service platform as the basis of an improved strategic risk management system. It would take transactional, dynamic and static data from the investor services cash and stock back-end systems, plus reference and market data from external sources, and transform them to create a standardized view. This view would be stored for use by the risk processing components to produce risk-related calculations, reports and decisions.

Working closely with our client as part of a joint team, we developed and delivered a data storage and processing system that aggregates and merges structured and unstructured local data. The system effectively resolves the challenges posed by the local data dictionaries and translation rules used by the different internal and external sources.

Our Solution

Our solution is a data service platform that all the investor services systems can use to gain a standardized view of the transactional and reference data. It is a high-performance data storage and processing system created with leading-edge technologies like Apache Storm, MarkLogic, Solace and Hadoop.

The solution transforms diverse data from disparate sources to create a standardized view with guaranteed data quality for system users. The bank can define business rules and apply them to incoming data streams, surfacing exceptions that can be traced back to specific source records.
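
As a rough illustration of this rule-exception idea, the sketch below models a business rule as a named predicate over an incoming record; records that fail a rule become exceptions carrying their source system. All names here (Rule, RuleBreach, the "source" field) are hypothetical, not the bank's actual rule engine.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

// Hypothetical sketch: a business rule is a named predicate over an
// incoming record; records that fail a rule become exceptions that can
// be traced back to the system they came from.
public class RuleCheck {
    record Rule(String name, Predicate<Map<String, String>> check) {}
    record RuleBreach(String rule, String sourceSystem, Map<String, String> record) {}

    static List<RuleBreach> apply(List<Rule> rules, List<Map<String, String>> stream) {
        List<RuleBreach> breaches = new ArrayList<>();
        for (Map<String, String> rec : stream) {
            for (Rule r : rules) {
                if (!r.check().test(rec)) {
                    breaches.add(new RuleBreach(r.name(), rec.get("source"), rec));
                }
            }
        }
        return breaches;
    }
}
```
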
The system supports two types of interaction:

Request/response-style service
  • The client sends a request and a payload is returned with the response.
Data streaming service
  • The service produces a continuous stream of data entities, for example real-time updates on deals, transactions and balances.
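
The two interaction styles above can be sketched as two small Java interfaces; the names are illustrative only, not the Platform's actual API.

```java
import java.util.function.Consumer;

// Hypothetical sketch of the two interaction styles the Platform supports.
public class Interactions {
    // Request/response: the client sends a request and receives a payload.
    interface RequestResponseService<Q, P> {
        P request(Q query);
    }

    // Streaming: the client subscribes and is pushed a continuous stream
    // of data entities (e.g. real-time deal, transaction, balance updates).
    interface StreamingService<E> {
        void subscribe(Consumer<E> onEntity);
    }
}
```
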

Intraday updates on deals, transactions and balances usually flow into the Platform from source systems as JMS messages and are stored in the master dataset built with MarkLogic DB.

End-of-day updates are usually transferred to the Platform as files over SFTP, often in proprietary formats. Files are then parsed according to rules predefined between the source systems and the Platform, and stored in MarkLogic.
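
A minimal sketch of such per-source parsing rules, assuming hypothetical source-system names and simple delimited formats; real rules would also cover encodings, field types and validation.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch: each source system has a predefined parsing rule,
// here just a field delimiter (as a regex) and an ordered field list.
public class FileParsing {
    record ParseRule(String delimiter, List<String> fields) {}

    static final Map<String, ParseRule> RULES = Map.of(
        "cash-backend",  new ParseRule(",",   List.of("dealId", "ccy", "amount")),
        "stock-backend", new ParseRule("\\|", List.of("dealId", "isin", "qty")));

    static Map<String, String> parseLine(String source, String line) {
        ParseRule rule = RULES.get(source);
        String[] parts = line.split(rule.delimiter());
        Map<String, String> rec = new LinkedHashMap<>();
        for (int i = 0; i < rule.fields().size(); i++) {
            rec.put(rule.fields().get(i), parts[i]);
        }
        return rec;
    }
}
```
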

All entities are retained in MarkLogic for up to 10 years, with a bi-temporal pattern applied to ensure ongoing conformity to audit and regulatory requirements. Once captured, deals, transactions and balances are enriched with reference data and transformed into a standard financial representation in accordance with the ISO 20022 financial standard.
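
The bi-temporal idea can be illustrated with a plain-Java model: each version of an entity carries a valid-time interval (when the fact was true in the business world) and a system-time interval (when the Platform knew about it), so any past state can be reconstructed for audit. MarkLogic supports bi-temporal documents natively; the sketch below, with hypothetical fields, only shows the lookup logic.

```java
import java.time.LocalDate;
import java.util.List;
import java.util.Optional;

// Hypothetical sketch of a bi-temporal lookup. Intervals are treated as
// half-open: [from, to).
public class BiTemporal {
    record Version(String payload,
                   LocalDate validFrom, LocalDate validTo,
                   LocalDate sysFrom, LocalDate sysTo) {}

    // "What did the Platform believe on sysDate about the state at validDate?"
    static Optional<Version> asOf(List<Version> history,
                                  LocalDate validDate, LocalDate sysDate) {
        return history.stream()
            .filter(v -> !validDate.isBefore(v.validFrom()) && validDate.isBefore(v.validTo()))
            .filter(v -> !sysDate.isBefore(v.sysFrom()) && sysDate.isBefore(v.sysTo()))
            .findFirst();
    }
}
```

A later correction closes the old version's system-time interval rather than overwriting it, so audits can still see what was believed before the correction.
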

Various business intelligence (BI) instruments, such as risk calculators, can be built on the ISO 20022-compliant documents published by the Platform to monitor, approve and manage deals and transactions. Where straight-through processing (STP) is not possible, the system can fall back to manual decision making.

Apache Storm is the primary tool used to build topologies that capture, store, enrich and transform deals, transactions and balances to provide the standardized data view. It also powers the calculators, meeting the near real-time non-functional requirement of end-to-end processing within two seconds. Rich internet application (RIA) user screens are built mainly with a proprietary software development kit (SDK) using angular.js with REST endpoints.
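
In production these stages run as Storm bolts wired into a topology; purely for illustration, the capture, enrich and transform steps can be modelled as composed functions over a plain record map. The reference data and the ISO 20022-style field names below (Dbtr, DbtrLEI, InstdAmt) are simplified examples, not the Platform's actual schema.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.UnaryOperator;

// Illustrative sketch of the enrich-then-transform chain. In the real
// system each stage is an Apache Storm bolt; here they are plain functions.
public class ProcessingChain {
    // Stand-in for an external reference-data source (counterparty -> LEI).
    static final Map<String, String> REFERENCE_DATA = Map.of("ACME", "LEI-12345");

    // Enrich: attach reference data, e.g. a legal entity identifier.
    static final UnaryOperator<Map<String, String>> enrich = rec -> {
        Map<String, String> out = new HashMap<>(rec);
        out.put("lei", REFERENCE_DATA.getOrDefault(rec.get("counterparty"), "UNKNOWN"));
        return out;
    };

    // Transform: map to standardized (ISO 20022-style) field names.
    static final UnaryOperator<Map<String, String>> transform = rec -> {
        Map<String, String> out = new HashMap<>();
        out.put("Dbtr", rec.get("counterparty"));
        out.put("DbtrLEI", rec.get("lei"));
        out.put("InstdAmt", rec.get("amount"));
        return out;
    };

    static Map<String, String> process(Map<String, String> rec) {
        return transform.apply(enrich.apply(rec));
    }
}
```
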

The core of the system is designed to provide a standardized view of the financial data (transactional and reference) to all the investor services systems, including – but not limited to – trade finance activities.


Business Benefits

Working closely with ESP has given our client an in-depth understanding of the latest technological approaches, the associated issues, and strategies to improve the development process. By helping to build the new data services platform while coaching the client’s analyst, development and test teams, we are bringing in-house the development and maintenance expertise needed to upgrade and support the system.

With the flexibility and scalability challenges resolved, the new high-performance data storage and processing system enables the client to formalize and clearly document its data processing steps. The data service platform has become the ‘golden data source’ for the business system users, forming the basis of the improved strategic risk management platform.

In addition, the platform provides the extra capacity required to store and retrieve all historical data, process data in real time and serve multiple users concurrently. Additional benefits include:

  • Flexibility – real-time data processing enables a faster, more proactive response to market changes and risk management triggers
  • Transparency – clear separation of all business logic elements based on their role in the data processing chain
  • Efficiency – distributed computing and caching enables more efficient utilization of hardware resources
  • Scalability – horizontal clustering built on leading-edge technologies has given the bank the increased capacity to cope with more data and manage future needs.

And last, but not least, the successful implementation has created development expertise and experience that can be re-used in different areas of the bank to drive further competitive advantage.