
Intel Enters Hadoop Fray

Intel has introduced its own distribution of Hadoop, incorporating enterprise-level features for security, performance and management. The IT giant has also announced a number of partners for the offering, which it will sell on a subscription basis, ranging from Cisco Systems and Dell to MarkLogic, SAP and Teradata.

As part of its distribution, Intel has made software updates to a number of Hadoop components, including the HDFS file system, the YARN distributed processing framework, the Hive SQL interface and the HBase columnar store. These updates have been contributed back to the Apache open source project, on which Intel’s distribution is based.

Performance enhancements include optimisation for solid state disks and cache acceleration, as well as hardware-based encryption and decryption that leverages the AES-NI instructions of Intel chips.
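By way of illustration only – this is not code from Intel’s distribution – the sketch below shows the kind of AES work such hardware acceleration targets: a plain Java (JCE) AES/CTR round trip that a modern JVM, or an OpenSSL-backed codec, can dispatch to the CPU’s AES-NI instructions when they are present. The class name and sample data are made up for the example.

import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Arrays;

// Hypothetical demo class: a generic AES/CTR encrypt/decrypt round trip of the
// sort that AES-NI-capable hardware accelerates. Plain JCE code, not Intel's.
public class AesCtrDemo {
    public static void main(String[] args) throws Exception {
        // Generate a 128-bit AES key and a random 16-byte counter/IV.
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(128);
        SecretKey key = keyGen.generateKey();
        byte[] iv = new byte[16];
        new SecureRandom().nextBytes(iv);

        byte[] plaintext = "sample block data".getBytes(StandardCharsets.UTF_8);

        // Encrypt with AES in counter mode.
        Cipher cipher = Cipher.getInstance("AES/CTR/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        byte[] ciphertext = cipher.doFinal(plaintext);

        // Decrypt and verify the round trip.
        cipher.init(Cipher.DECRYPT_MODE, key, new IvParameterSpec(iv));
        byte[] decrypted = cipher.doFinal(ciphertext);
        System.out.println(Arrays.equals(plaintext, decrypted)); // prints true
    }
}

Whether the hardware path is actually used depends on the JVM and crypto provider in play, not on the application code itself.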

Intel has also introduced a proprietary module – Intel Manager for Apache Hadoop – which provides additional functionality for deployment, management, monitoring and security.

As well as its own Hadoop distribution, Intel is continuing to develop its Graph Builder visualisation tool for analysing Hadoop-based data. It has also invested in 10gen, developer of the MongoDB NoSQL database, and in operational analytics specialist Guavus.

In introducing its own distribution, Intel expects to accelerate deployment of Hadoop – and, alongside it, sales of its microprocessors, SSDs and networking products. That will mean increased competition for the big three Hadoop startups – Cloudera, Hortonworks and MapR Technologies – all of which offer distributions with their own added features and functionality.
