Is HP Labs’ supercomputer, The Machine, the new hope for too-big data? – SiliconANGLE (blog)


With practically limitless data and applications demanding microseconds-fast insight, it’s poor timing that Moore’s law of perpetually increasing processor power is now AWOL.

“How do we get back exponential scaling on supply to meet this unending, exponential demand?” asked Kirk Bresniker (pictured, right), fellow, vice president and chief architect at HP Labs at Hewlett Packard Enterprise Co.

We will not regain it through the familiar technologies of the past three decades, nor a single point solution, Bresniker stated in an interview during HPE Discover in Las Vegas, Nevada.

This is borne out each day in HP Labs generally and in the company’s ongoing work on The Machine, its memory-driven compute program, according to Andrew Wheeler (pictured, left), fellow, vice president and deputy director of HP Labs.

Bresniker and Wheeler spoke with John Furrier (@furrier) and Dave Vellante (@dvellante), co-hosts of theCUBE, SiliconANGLE Media’s mobile live streaming studio, during HPE Discover. (* Disclosure below.)

After some mixed press for The Machine last December, HPE has been doggedly pushing it closer to prime-time production, Wheeler explained.

“There are a lot of moving parts around it, whether it’s around the open-source community and kind of getting their head wrapped around, what does this new architecture look like?” Wheeler said.

The Machine will require a chain of partners and ancillary parts to yield real use-cases, Wheeler added.

“We had the announcement around DZNE as kind of an early example,” he said, referring to the German Center for Neurodegenerative Diseases’ use of The Machine in analyzing massive medical data.

The metastasizing machine

The Machine has also materialized as what HPE calls the “Computer Built for the Era of Big Data,” a massive system running on a single pool of memory.

Internet of Things data and, specifically, the intelligent edge are calling out for data training abilities like those in this supercomputer, according to Bresniker. Presently, almost all data ingested at the edge is thrown away before it’s analyzed, let alone monetized, he added.

“The first person who understands, OK, I’m going to get one percent more of that data and turn it into real-time intelligence, real-time action — that will unmake industries, and it will remake new industries,” Bresniker concluded.

Watch the complete video interview below, and be sure to check out more of SiliconANGLE’s and theCUBE’s independent editorial coverage of HPE Discover US 2017. (* Disclosure: TheCUBE is a paid media partner for HPE Discover US 2017. Neither Hewlett Packard Enterprise Co. nor other sponsors have editorial control over theCUBE or SiliconANGLE.)

Photo: SiliconANGLE