HP updates progress towards The Machine

A mockup of The Machine on display at HP Discover

LAS VEGAS – A year after it first introduced The Machine at its HP Discover event here, HP Labs has provided an update on its progress towards its “memory-centric” computing model.

The Machine is HP Labs’ name for what the company sees as a paradigm shift in computing, from a model where the processor is the central resource to one where “memory is the centre of the universe,” tying small, cheap, commodity processing capabilities (think the company’s Moonshot servers) to massive amounts of non-volatile memory to tackle big data and other challenges.
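To make the shift concrete: since The Machine’s hardware and programming interfaces aren’t publicly available, the following is only a minimal Python sketch of the memory-centric idea on a conventional OS, with many small workers computing in place against one shared, memory-mapped pool rather than each copying data into its own private memory. The pool size, file name, and worker count are all hypothetical.

```python
import mmap
from concurrent.futures import ThreadPoolExecutor

POOL_BYTES = 64 * 1024 * 1024      # hypothetical stand-in for a massive shared pool
CHUNK = POOL_BYTES // 8

def make_pool(path="pool.bin"):
    """Create a file-backed pool and map it once; all workers share the mapping."""
    with open(path, "wb") as f:
        f.truncate(POOL_BYTES)     # allocate the backing store
    backing = open(path, "r+b")
    return mmap.mmap(backing.fileno(), POOL_BYTES)

def worker(pool, idx):
    """A small, cheap compute task operating in place on the shared pool."""
    start = idx * CHUNK
    pool[start:start + CHUNK] = bytes([idx]) * CHUNK   # write directly, no copies
    return sum(pool[start:start + 16])                 # read back a small sample

if __name__ == "__main__":
    pool = make_pool()
    with ThreadPoolExecutor(max_workers=8) as ex:      # the "commodity cores"
        print(list(ex.map(lambda i: worker(pool, i), range(8))))
```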

When HP went public with the idea for The Machine, it was closely tied to the company’s research into memristors as a way of delivering non-volatile memory in huge amounts. But with this year’s update, HP is no longer positioning memristors as the only path to what HP CTO Martin Fink brands as “memory-driven computing.”

The first instance of The Machine, a prototype still slated to see the light of day by the end of 2016, will run on common DRAM, as it’s cheap, readily available, and effectively non-volatile as long as the power keeps flowing. But Fink said that beyond that, the company will also provide high-capacity memory via phase-change memory, and then, in the longer term, resistive RAM or memristors. By going with readily available, commodity DRAM out of the gate, HP expects to get The Machine up and running sooner, although getting the product fully prepared and “productized” remains “an end-of-decade project” for the company.

Fink said the prototype Machine will be a one-rack system supporting 320 TB of main memory and 2,500 CPU cores, a huge jump from today’s systems, which top out at about 12 TB of main memory per rack.

In terms of software, last year HP talked extensively about its plan to create and broadly beta test a custom OS for The Machine. While that OS is still in the works, and remains the end goal for both very large and very small (especially Internet of Things) instances of The Machine, the company said its number-one OS priority is to get a Linux distribution booting on the new computer, as customers have indicated an interest in applying the high-memory computing model to their existing workloads.

Fink said HP will welcome both friends and foes to deliver workloads on The Machine, mentioning SAP’s HANA in-memory database and IBM’s Watson cognitive computing platform as two likely candidates well suited to the memory-centric model. But ultimately, he said, it’s “the new workloads, the new applications that do things you couldn’t do before” that are really interesting to HP.

Fink’s favourite example of such an app would put all the details of an airline’s operations – every plane, every gate at every airport around the world, every flight crew, ground crew, and gate crew, and the equipment they use – to work in one giant in-memory database, which could then be used to smooth the most frequent air-travel hiccups, from weather-related delays to longer-than-necessary waits for a gate when a plane arrives early.
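As a hedged illustration of that example (HP has published no schema or API for such a system), here is a toy in-memory model of the gate-assignment case; the record types, field names, and sample data are all invented for this sketch.

```python
from dataclasses import dataclass

@dataclass
class Gate:
    airport: str
    gate_id: str
    free_at: int     # minute of the day when the gate becomes free

@dataclass
class Flight:
    number: str
    arrives_at: str  # airport code
    eta: int         # estimated arrival, minute of the day

def assign_gate(flight, gates):
    """Pick the gate at the arrival airport that frees up soonest before the ETA,
    so an early-arriving plane does not wait longer than necessary."""
    candidates = [g for g in gates
                  if g.airport == flight.arrives_at and g.free_at <= flight.eta]
    return min(candidates, key=lambda g: g.free_at, default=None)

gates = [Gate("YYZ", "D21", free_at=540), Gate("YYZ", "D23", free_at=600)]
flight = Flight("AC123", arrives_at="YYZ", eta=585)  # lands 15 minutes early
print(assign_gate(flight, gates))  # -> Gate(airport='YYZ', gate_id='D21', ...)
```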

He also painted a picture of a future Machine-based software appliance model, whereby software makers would offer their products on customized chips that plug into The Machine and gain access to all of its memory.

The company is also touting system-wide security with the new technology: The Machine is intended to encrypt all stored information by default.
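HP has not detailed how that encryption will work, but as an illustrative sketch, an “encrypted by default” store can be approximated in Python with the third-party cryptography package (pip install cryptography), so that plaintext is never held at rest; the class and method names here are hypothetical.

```python
from cryptography.fernet import Fernet

class EncryptedStore:
    """A tiny key-value store that keeps only ciphertext at rest."""
    def __init__(self):
        self._fernet = Fernet(Fernet.generate_key())
        self._data = {}

    def put(self, key, value: bytes):
        self._data[key] = self._fernet.encrypt(value)  # ciphertext only

    def get(self, key) -> bytes:
        return self._fernet.decrypt(self._data[key])   # decrypt on access

store = EncryptedStore()
store.put("flight", b"AC123 arriving YYZ")
print(store.get("flight"))  # -> b'AC123 arriving YYZ'
```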

“If The Machine offered no enhancements in performance and no opportunities for new workloads, and the only thing it did was accomplish its security goals, it would still be valuable,” Fink predicted.

After next year’s 320 TB DRAM-based prototype, Fink said, the company’s plan is to scale The Machine into the petabyte range and beyond, with the expectation that it will be on the market by 2020.