About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Intel Enters Hadoop Fray

Intel has introduced its own distribution of Hadoop, incorporating enterprise-level features for security, performance and management. The IT giant has also announced a number of partners for its offering, which it will sell on a subscription basis, ranging from Cisco Systems and Dell to MarkLogic, SAP and Teradata.

As part of its distribution, Intel has made software updates to a number of Hadoop components, including the HDFS file system, the YARN distributed processing framework, the Hive SQL interface and the HBase columnar store. These updates have been contributed back to the Apache open source project, on which Intel's distribution is based.
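The processing model these components support can be illustrated with a minimal sketch: a toy map/reduce word count in plain Python. This is a hypothetical, single-process illustration of the pattern Hadoop distributes across a cluster, not code from Intel's or Apache's distribution:

```python
from collections import defaultdict

def map_phase(lines):
    # Map step: emit (word, 1) pairs from each input line,
    # as a Hadoop mapper would for each HDFS block it reads.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Reduce step: sum the counts for each key, as a Hadoop
    # reducer would after the shuffle/sort stage groups them.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

sample = ["Hadoop stores data in HDFS", "Hive queries data in Hadoop"]
print(reduce_phase(map_phase(sample)))
```

In a real deployment, HDFS holds the input blocks, YARN schedules the map and reduce tasks across nodes, and Hive lets users express the same aggregation as SQL instead of hand-written code.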

Performance enhancements include optimisation for solid state disks and cache acceleration, plus hardware-based encryption and decryption leveraging the AES-NI instructions of Intel chips.

Intel has also introduced a proprietary module – Intel Manager for Apache Hadoop – which provides additional functionality for deployment, management, monitoring and security.

As well as its own Hadoop distribution, Intel is continuing to develop its Graph Builder visualisation tool for analysing Hadoop-based data. It has also made investments in 10gen and its MongoDB NoSQL database and in operational analytics specialist Guavus.

In introducing its own distribution, Intel expects to accelerate deployment of Hadoop – and sales of its microprocessors, SSDs and networking products alongside. That will mean increased competition for the big three Hadoop startups – Cloudera, Hortonworks and MapR Technologies – which all offer distributions with their own added features and functionality.
