MemSQL Ships Distributed In-Memory Database

Following up on its unveiling in June of last year, San Francisco-based MemSQL has released a distributed version of its eponymous in-memory database, aimed at introducing big data scalability to the low latency performance provided by the initial release.

With the distributed version, MemSQL combines in-memory performance with access via standard SQL. The company says the offering is already in use for applications such as operational analytics, network security, real-time recommendations and risk management.
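MemSQL speaks the MySQL wire protocol, so any MySQL-compatible client can issue plain SQL against it. The snippet below is a minimal sketch of that access path using the PyMySQL driver; the host, credentials, schema and query are hypothetical placeholders rather than details from the release.

```python
import pymysql  # any MySQL-compatible driver works, as MemSQL speaks the MySQL wire protocol

# Placeholder connection details -- host, credentials and database are hypothetical.
conn = pymysql.connect(
    host="memsql-aggregator.example.com",
    port=3306,
    user="app_user",
    password="secret",
    database="analytics",
)

try:
    with conn.cursor() as cur:
        # Ordinary SQL; no proprietary query language is needed.
        cur.execute(
            "SELECT symbol, COUNT(*) AS trades, AVG(price) AS avg_price "
            "FROM trades "
            "WHERE ts >= NOW() - INTERVAL 5 MINUTE "
            "GROUP BY symbol ORDER BY trades DESC LIMIT 10"
        )
        for symbol, trades, avg_price in cur.fetchall():
            print(symbol, trades, avg_price)
finally:
    conn.close()
```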

MemSQL scales out across commodity hardware and has already been deployed in production across hundreds of nodes, delivering sub-second response times on terabytes of data. Data redundancy and security are provided by duplication across nodes, checkpoints to physical disk, and replication across data centres.
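Scale-out in the distributed product rests on declaring how rows are spread across nodes. The sketch below is a minimal, hypothetical illustration using MemSQL's SHARD KEY clause to hash rows across the cluster; the schema, column names and types are illustrative only.

```python
import pymysql

conn = pymysql.connect(host="memsql-aggregator.example.com", port=3306,
                       user="app_user", password="secret", database="analytics")

# Hypothetical schema: the SHARD KEY clause tells MemSQL which column to hash
# when distributing rows across the cluster's nodes.
CREATE_TRADES = """
CREATE TABLE IF NOT EXISTS trades (
    trade_id BIGINT NOT NULL,
    symbol   VARCHAR(16) NOT NULL,
    price    DECIMAL(18, 6) NOT NULL,
    ts       DATETIME NOT NULL,
    PRIMARY KEY (trade_id),
    SHARD KEY (trade_id)
)
"""

with conn.cursor() as cur:
    cur.execute(CREATE_TRADES)
conn.commit()
conn.close()
```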

The company has worked with Morgan Stanley to create a real-time bond data application used by 25,000 financial advisors nationwide. MemSQL allows the development team to balance high-velocity data streaming into the system against a large number of concurrent queries, and has enabled Morgan Stanley to manage big data workloads, accelerate development and reduce total cost of ownership by scaling on commodity hardware.
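The balancing act described here, continuous high-velocity ingest alongside many concurrent reads, can be illustrated with a generic sketch. This is not Morgan Stanley's application; it simply runs writer and reader threads against the hypothetical trades table from the sketches above.

```python
import threading
import time
import pymysql

def connect():
    # Same placeholder connection details as the earlier sketches.
    return pymysql.connect(host="memsql-aggregator.example.com", port=3306,
                           user="app_user", password="secret",
                           database="analytics", autocommit=True)

def writer(n_rows=10_000):
    # Simulates a high-velocity stream of inserts.
    conn = connect()
    with conn.cursor() as cur:
        for i in range(n_rows):
            cur.execute(
                "INSERT INTO trades (trade_id, symbol, price, ts) "
                "VALUES (%s, %s, %s, NOW())",
                (i, "XYZ", 100.0 + i * 0.01),
            )
    conn.close()

def reader(n_queries=200):
    # Simulates concurrent analytical queries running during ingest.
    conn = connect()
    with conn.cursor() as cur:
        for _ in range(n_queries):
            cur.execute("SELECT symbol, AVG(price) FROM trades GROUP BY symbol")
            cur.fetchall()
            time.sleep(0.05)
    conn.close()

threads = [threading.Thread(target=writer)] + [threading.Thread(target=reader) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```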

Also new is MemSQL Watch, a browser-based interface for software and hardware monitoring and system configuration.
