
The $10 Hedge Fund Supercomputer That’s Sweeping Wall Street Thanks to Nvidia GeForce GPU Chips



McKee, 35, is part of a wave of math and computer whizzes that’s pushing data science to new heights across Wall Street. What’s remarkable about their efforts isn’t that AI science fiction is suddenly becoming AI science fact (sorry, Steven Spielberg). It’s something more mundane: thanks to cloud computing, mind-blowing data analysis is getting so cheap that many businesses can easily afford it.


I'm disappointed this article does not explain why these AI computations are suddenly so cheap.

Five years ago, the sort of programming involved in McKee’s 1-trillion-point dense matrix would have taken months of coding and $1 million-plus of hardware. Now McKee simply logs onto Amazon Web Services to name his price for computing capacity and sets his code loose. Out of a loft in the Flatiron District in Manhattan, he works on what he calls “coffee time.” His goal is to make every model -- no matter how much data are involved -- compute in the time it takes him to putter to his office kitchen, brew a Nespresso Caramelito, and walk back to his desk.

The article only tells part of the story. 

The rise of cloud computing is making much of this cheaper and faster. It’s also helping spawn a new AI industry. In all, 16 AI companies got initial backing from venture capitalists in 2014, up from two in 2010, according to data compiled by researcher CB Insights for Bloomberg. The amount invested in the startups -- some of which describe themselves as doing machine learning or deep learning -- soared to $309.2 million last year, up more than 20-fold from $14.9 million in 2010.

It's more than just "cloud computing". It's the rise of GPU chip techniques. 

I think it's all quant trading, which means there are floating-point numbers in every one of his spreadsheet cells, and GPGPU now accelerates exactly that kind of number crunching. Plus, clustered AI is easier to run across multiple machines now, and AWS offers elastic spot instances where you can bid on spare computing and dynamically launch as many instances as you need for a very short period of time. They do mention that Sensai company doing unstructured text, which sounds really interesting.

I guess that just a few short years ago, owning the machines, paying for the onsite maintenance, and hiring a software staff to build something that scaled across machines would have cost millions.
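The own-versus-rent point above can be sketched as back-of-envelope arithmetic. All of the prices below are hypothetical placeholders (not real AWS or hardware quotes) -- the point is only the shape of the comparison: capital plus ongoing overhead versus pay-per-hour spot capacity.

```python
# Back-of-envelope comparison: owning a cluster vs. renting spot capacity.
# All numbers are hypothetical illustrations, not real quotes.

def on_prem_cost(servers, price_per_server, annual_opex_rate, years):
    """Capital cost plus power/space/staff overhead over the period."""
    capex = servers * price_per_server
    opex = capex * annual_opex_rate * years
    return capex + opex

def spot_cost(instance_hours, price_per_hour):
    """Pay only for the hours the burst job actually runs."""
    return instance_hours * price_per_hour

# Hypothetical: 100 servers at $10k each, 50%/yr overhead, kept for 3 years.
owned = on_prem_cost(100, 10_000, 0.5, 3)

# Hypothetical: the same burst as 100 spot instances for 10 hours at $0.25/hr.
rented = spot_cost(100 * 10, 0.25)

print(owned, rented)  # 2500000.0 250.0
```

The asymmetry is the whole story: the owned cluster bills you whether or not a model is running, while elastic capacity only bills for the ten hours the matrix job actually needs.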

[Infographic: Nvidia GPU GFLOPS performance chart]

Why has the NVIDIA GPU single precision seemingly gone exponential?

Market demand for single-precision floating point is higher than for double precision. So this year with the Maxwell architecture (GeForce GTX Titan X) Nvidia actually decided to REPLACE circuitry that was reserved for double precision with more single-precision units. Next year with the Pascal Architecture they will re-emphasize double precision again. The Kepler architecture (Tesla K20, K40) was for both single and double precision.

And yes, we are having a lot of fun with Nvidia GPUs for all the deep learning stuff nowadays!

See here for more info on Maxwell, the single-precision-optimized architecture:

Ajay, thank you for that spec and the explanation.

Seems like Nvidia at an $11 billion market cap could be a takeover target for anyone who wanted to lock up these next gen GPU chips to have an edge on deep learning applications in the coming years. 
