Actian Approaches One Year; Delivers Vectorwise Appliance With Lenovo
Just ahead of the first anniversary of its corporate makeover, Actian has introduced its Vectorwise Data Mart Appliance, in partnership with Lenovo. Initial models – based on Lenovo’s ThinkServer RD360 server – come in 100 gigabyte and one terabyte flavours. Formerly known as Ingres, Actian continues to develop and sell the open source transactional database...
AMD SeaMicro Server Offers 512 Cores and 5PB Storage; And Intel Chips!
AMD’s just-announced SeaMicro SM15000 server with Freedom Fabric Storage pushes density limits, packing up to 512 compute cores and 4TB of RAM into a 10 RU system that connects to more than five petabytes of disk storage. Unusually, the first units available will feature chips from rival Intel, with models based...
Tervela, Teradata Partner To Move and Distribute Big Data
Messaging specialist Tervela is partnering with analytics database vendor Teradata to deliver solutions for moving and distributing big data. The alliance will help customers upload their data into Teradata’s data warehouse for analysis, and will also allow rapid distribution of data across multiple warehouses. “We are seeing demand for high performance data movement in capital...
Q&A: ParStream’s Mike Hummel on Bringing Low Latency to Big Data
Bringing low latency to the world of big data is what ParStream – which recently raised $5.6 million in series A funding – has been working on for several years now, with some impressive results. We talked to CEO Mike Hummel to find out more about the company and its technology. Q: First,...
IBM Pitches EC12 Mainframe at Big Data, Cloud Apps
IBM has introduced its latest System z mainframe, the EC12 – the result of more than $1 billion in R&D – and is pitching it at enterprise data and cloud applications. The mainframe is based on IBM’s proprietary chips, featuring 32 nanometre designs and running at 5.5GHz. Also included is transactional memory, a variant of...
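Transactional memory, for context, lets a block of memory operations execute atomically, with conflicting concurrent updates rolled back and retried rather than serialised through locks. As a minimal sketch of the programming model – this uses GCC’s language-level transactional memory extension (compiled with -fgnu-tm), and the account fields and amounts are purely illustrative, not anything from IBM’s implementation:

    /* Illustrative only: GCC's -fgnu-tm language-level transactions.
       The EC12 exposes hardware transactions through its own
       instruction set; this sketch just shows the general idea. */
    #include <stdio.h>

    static long checking = 1000;
    static long savings  = 500;

    /* Either both updates commit or neither does; on a conflict
       with another thread, the transaction is re-executed. */
    void transfer(long amount)
    {
        __transaction_atomic {
            checking -= amount;
            savings  += amount;
        }
    }

    int main(void)
    {
        transfer(250);
        printf("checking=%ld savings=%ld\n", checking, savings);
        return 0;
    }

Compiled with gcc -fgnu-tm, the atomic block behaves like a critical section without an explicit lock – the property that makes hardware support attractive for heavily concurrent workloads.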
Market Participants Fund Research in Supercomputing/Data Intensive Science For Financial Markets
A number of financial market participants are funding research into the use of supercomputing and data-intensive science directed at improving the stability, regulation and enforcement of U.S. markets. The $100,000 in funding is being directed to the Center for Innovative Financial Technology at Lawrence Berkeley National Laboratory. The funders are Tudor Investment Corp., AJO Partners,...
Corvil Pushes Latency Management Towards Big Data
With an effective storage capacity of 60 terabytes, Corvil’s new CNE-7300 latency measurement appliance can retain tick data over an extended period, potentially making it useful as a data archive for ‘big data’ applications beyond latency management. According to Donal O’Sullivan, head of product management at Corvil, the CNE-7300 will likely be able to store...
Big Data – The Other Side of Low Latency
We write a lot here about the latency of moving data from point A to point B. But latency is also inherent in the processing of that data at points A and B. Most likely, the processing of data is a more complex undertaking than transporting it. And that’s where big data comes in. For...
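To make that point concrete, a toy benchmark along the following lines – the buffer size and per-record workload here are assumptions chosen purely for illustration – typically shows the per-byte compute costing as much as or more than the bulk move itself:

    /* Illustrative sketch: comparing "transport" (a bulk copy) with
       "processing" (per-byte conditional work) over the same data.
       The size and workload are arbitrary demonstration choices. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <time.h>

    #define N (64 * 1024 * 1024)

    static double elapsed_ms(struct timespec a, struct timespec b)
    {
        return (b.tv_sec - a.tv_sec) * 1e3 + (b.tv_nsec - a.tv_nsec) / 1e6;
    }

    int main(void)
    {
        unsigned char *src = malloc(N), *dst = malloc(N);
        struct timespec t0, t1;
        memset(src, 7, N);

        /* "Point A to point B": a straight copy. */
        clock_gettime(CLOCK_MONOTONIC, &t0);
        memcpy(dst, src, N);
        clock_gettime(CLOCK_MONOTONIC, &t1);
        printf("transport:  %.1f ms\n", elapsed_ms(t0, t1));

        /* "Processing at point B": touch every byte with branching logic. */
        clock_gettime(CLOCK_MONOTONIC, &t0);
        long sum = 0;
        for (size_t i = 0; i < N; i++)
            sum += (dst[i] % 2) ? dst[i] : -dst[i];
        clock_gettime(CLOCK_MONOTONIC, &t1);
        printf("processing: %.1f ms (sum=%ld)\n", elapsed_ms(t0, t1), sum);

        free(src); free(dst);
        return 0;
    }

The absolute numbers depend entirely on the hardware; the point is only that the compute-at-the-endpoints term is a real, and often dominant, component of end-to-end latency.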
Infochimps Rolls Platform Update; Adds Realtime, Streaming Capabilities
Infochimps has released version 1.1 of its cloud-hosted Big Data Platform, adding the capability to connect to streaming data. The updated Data Delivery Service – based on Apache Flume – allows the platform to connect to enterprise data sources via either out-of-the-box or custom-developed connectors. Within the DDS is the ability to process data streams,...
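For orientation, Flume agents of the sort the DDS builds on are wired together through a simple properties file naming sources, channels and sinks. A minimal sketch of such a configuration follows – the agent name, port and HDFS path are illustrative stand-ins, and Infochimps’ own connectors will differ:

    # Illustrative Flume NG agent: one netcat source feeding an
    # in-memory channel, drained by an HDFS sink. All names, ports
    # and paths are examples only.
    agent.sources  = src1
    agent.channels = ch1
    agent.sinks    = snk1

    agent.sources.src1.type     = netcat
    agent.sources.src1.bind     = localhost
    agent.sources.src1.port     = 44444
    agent.sources.src1.channels = ch1

    agent.channels.ch1.type     = memory
    agent.channels.ch1.capacity = 10000

    agent.sinks.snk1.type      = hdfs
    agent.sinks.snk1.hdfs.path = hdfs://namenode/flume/events
    agent.sinks.snk1.channel   = ch1

Such a file is handed to flume-ng agent --name agent --conf-file <file>; custom connectors slot in as alternative source implementations.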
Big Data in Financial Services – Opportunity or Cost?
By John Bantleman, RainStor (www.rainstor.com). We recently hosted a dinner in New York City with 20 technology executives focused on big data in banking and financial services. I found the event insightful, so I thought it would be interesting to share some of the perspectives from those who attended. The first (and close to my...