About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Metamako Adds MetaProtect Firewall to Portfolio of FPGA Network Appliances


Metamako has extended its portfolio of field programmable gate array (FPGA) enabled network solutions with MetaProtect Firewall, a network appliance designed to deliver ultra-fast firewall protection in situations where a firewall is mandatory but ultra-low latency and high port density are also required.

The firewall solution takes Metamako into the security space for the first time and builds on growth plans that include the company’s recent, and first, acquisition of Chicago-based xCelor’s hardware business.

MetaProtect is a 48-port (10GbE) network appliance that performs packet filtering in 130 nanoseconds and provides comprehensive logging for the filters. Configuration is flexible: ports that do not need filtering can be specified, in which case packets are passed through in 5 nanoseconds.
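The filtered-versus-bypass trade-off described above can be sketched in a few lines of Python. Note this is a purely illustrative model: the class and field names are hypothetical and do not reflect Metamako's actual configuration interface; only the two latency figures come from the article.

```python
from dataclasses import dataclass

FILTERED_LATENCY_NS = 130  # packet-filtering path (per the article)
BYPASS_LATENCY_NS = 5      # pass-through path for unfiltered ports

@dataclass
class PortConfig:
    """Hypothetical per-port setting on a 48-port appliance."""
    port: int              # 1..48
    filtered: bool = True  # filtering can be switched off per port

    def path_latency_ns(self) -> int:
        # A packet on a filtered port pays the filtering cost;
        # a bypassed port only pays the pass-through cost.
        return FILTERED_LATENCY_NS if self.filtered else BYPASS_LATENCY_NS

# Example: filter every port except a trusted internal link on port 48.
ports = [PortConfig(p, filtered=(p != 48)) for p in range(1, 49)]
```

The point of the model is simply that latency is decided per port, so a mixed deployment can keep exchange-facing ports firewalled while internal links run at pass-through speed.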

Dave Snowdon, founder and chief technology officer at Metamako, says: “Clients have seen the benefits of using our low-latency devices and asked if we could improve their firewall architecture. We were able to draw on our flexible FPGA platforms and app infrastructure to very quickly build the right product for those customers and the result is MetaProtect – a low latency firewall.”

Considering situations that mandate a firewall, Snowdon points to exchanges in Asia, for example the Korea Exchange (KRX), which stipulate that a broker must ‘own and manage’ a firewall between a client’s trading servers and the exchange. The latency penalty this introduces is a problem for trading participants, but it can be eased by using Metamako’s ultra-low latency, high-density firewall solution to improve exchange-facing architecture.

Key functionality of MetaProtect includes:

- Ultra-low latency filtering, with average latency from 130 nanoseconds (1 rule) to 155 nanoseconds (510 rules)
- Extreme determinism, with a tightly bounded maximum latency for each configuration
- Up to 510 rules per port
- Extensive packet statistics on all ports for advanced network monitoring
- Comprehensive logging, including logged statistics of permitted and denied packets
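The published figures pin down only the two endpoints of the latency range. As a rough back-of-the-envelope sketch, interpolating linearly between them suggests each additional rule costs on the order of 0.05 nanoseconds; the linear scaling is an assumption for illustration, not a vendor claim.

```python
def estimated_avg_latency_ns(rules: int) -> float:
    """Estimate average filtering latency for a given rule count.

    Linearly interpolates between the two published data points:
    130 ns at 1 rule and 155 ns at 510 rules. The linearity in
    between is assumed, not stated by the vendor.
    """
    if not 1 <= rules <= 510:
        raise ValueError("MetaProtect supports up to 510 rules per port")
    lo_rules, lo_ns = 1, 130.0
    hi_rules, hi_ns = 510, 155.0
    frac = (rules - lo_rules) / (hi_rules - lo_rules)
    return lo_ns + frac * (hi_ns - lo_ns)
```

Under this assumption, even a fully loaded port (510 rules) adds only about 25 ns over a single-rule configuration, which is consistent with the tight determinism claim.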

