Supercharge MATLAB with your graphics card?

May 26th, 2009 | Categories: math software, matlab, programming

Did you know that your graphics card is effectively a mini-supercomputer?  Your main CPU (Central Processing Unit) probably has 2 processor cores, 4 if you are lucky, but a high end graphics card can have as many as 96 processing cores (so-called ‘stream processors’) – which is a lot.  Even my laptop’s relatively low end NVIDIA GeForce 8400M GS has 16 stream processors according to this Wikipedia page.

The large number of cheap processor cores is the good news.  The bad news is that they are not as capable as fully fledged Intel or AMD processor cores since, as you might expect, the cores in your graphics card are rather specialised.  They were designed specifically to do the mathematics behind graphics processing and they do this very well indeed but until fairly recently it was rather difficult to get them to do much else.

That hasn’t stopped people from trying though.  Some time ago, NVIDIA, the makers of my laptop’s graphics card, released a software library called CUDA which enables C programmers to access the vast computational power locked away in a typical pixel pusher.  The results have been nothing short of astonishing.  One developer, for example, recently demonstrated how to use CUDA to calculate the properties of the Ising model (a staple of undergraduate computational physics courses) over 60 times faster than a single, bog standard Intel CPU.

If you are impressed with a factor-60 speed up then the 675 times speed up reported by Michał Januszewski and Marcin Kostur is really going to knock your socks off.  Yep – that’s not a typo.  They have written code that can solve certain stochastic differential equations SIX HUNDRED AND SEVENTY FIVE TIMES FASTER than a single, standard CPU core.  Your shiny new dual quad-core workstation isn’t looking so clever now, is it?  Not bad for technology designed for playing the latest version of Quake.

This is all well and good but I don’t have the time or the mental stamina to code in C anymore.  What I want is for all of my favourite Mathematica, MATLAB or Python functions to be CUDA-ised so that I can enjoy a big speed up at low cost and low programming effort.  I’ll take the moon on a stick while I’m at it if you don’t mind.

Well, it seems that some people are doing exactly this.  I have just stumbled across a project called GPUmat which claims to offer up to a 40x speed up with very little effort on the part of the user.  One example they give considers the following standard MATLAB code.

A = single(rand(100)); % A is on CPU memory
B = single(rand(100)); % B is on CPU memory
C = A+B;    % executed on CPU.
D = fft(C); % executed on CPU

To get this running on your graphics card, all you need to do (after you’ve installed the toolbox and CUDA of course) is change the code to:

A = GPUsingle(rand(100)); % A is on GPU memory
B = GPUsingle(rand(100)); % B is on GPU memory
C = A+B;    % executed on GPU.
D = fft(C); % executed on GPU

Very nice.  I’m not sure which MATLAB functions are supported but I guess it’s all there in the documentation – I just haven’t had time to look through it yet.  I’d love to tell you what sort of speed up I experienced when I tried it out but, unfortunately, the developers ask all potential users to register before they get access to the downloads and that put me off a bit.
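If you do take it for a spin, a rough-and-ready way to measure the speed up would be something like the sketch below.  It assumes that GPUmat’s GPUsingle type behaves as in the example above and that calling single() on a GPU variable copies the result back to CPU memory – I haven’t been able to check the details without registering, so treat it as a sketch rather than gospel.

% Rough timing sketch - assumes GPUmat is installed and initialised,
% and that GPUsingle/single conversions work as in the example above.
N = 2048;
A = single(rand(N));          % data on CPU memory
B = single(rand(N));

tic
C_cpu = fft(A + B);           % executed on the CPU
cpuTime = toc;

Ag = GPUsingle(A);            % copy data to GPU memory
Bg = GPUsingle(B);
tic
C_gpu = fft(Ag + Bg);         % executed on the GPU
gpuTime = toc;

fprintf('CPU: %.3f s   GPU: %.3f s   speed up: %.1fx\n', ...
        cpuTime, gpuTime, cpuTime/gpuTime);

% Assumption: single() copies the result back to CPU memory so the
% two answers can be compared.
maxDiff = max(max(abs(C_cpu - single(C_gpu))))

Note that I’ve bumped the matrix size up from 100 to 2048 – the GPU only really gets a chance to shine on larger problems, and on tiny ones the cost of copying data to and from the card can swamp any gain.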

It’s all free though so if you’d like to check it out yourself, and you don’t mind the registration, then head over to the developer’s website. I’d love to hear how you get on.

PS: Make sure your graphics card is CUDA compatible first though.  You’ll waste a lot of time trying out this software if it isn’t!

  1. May 27th, 2009 at 00:51

    I’m a developer at AccelerEyes. Over the last year we built a similar tool called Jacket. Compared to GPUmat, it supports more functions, handles all memory management for you, lets you interactively develop OpenGL visualizations of your data, and more. You can check out some of the demo videos: http://www.accelereyes.com and see visualizations from several end-to-end applications: http://www.accelereyes.com/graphics_toolbox.php.
    -James

  2. HS
    May 27th, 2009 at 05:44

    Thanks, this is super interesting for me. I’ll definitely try it out once I get my Mac Pro. My moon on a stick (never heard that one before!) would be a SimPy-like library that makes use of that.

  3. Mike Croucher
    May 27th, 2009 at 10:02

    Hey HS – the ‘moon on a stick’ phrase comes from a relatively obscure British comedy from the mid-90s. It aired when I was a teenager and ended up entering my vocabulary.

    Do Mac Pros have NVIDIA graphics cards…(excuse me while I google)…seems they do. Sometimes.

  4. Mike Croucher
    May 28th, 2009 at 15:58

    @James

    Sorry for the delay in getting your comment through moderation – WordPress put you in the spam bin for some reason. Your product looks fantastic – I’ll have a play with the free trial at some point in the future and maybe write it up here.

    I found it difficult to get hold of pricing information though. Your site wanted me to register just to get an idea of the cost which is a bit odd I feel.

    What is everyone’s obsession with requiring registration these days?

    Cheers,
    Mike

  5. May 31st, 2009 at 05:13

    Mike,
    Thanks for posting the comment. You have to log in to see the pricing. We’re using that as a way to get information on who’s interested. Let me know if you have any trouble.
    When installing on the MacPro, be sure to get the NVIDIA Toolkit in your PATH variable after installing. Check the User Guide to see how to update either your ~/.MacOSX/environment.plist or .bashrc/tcshrc/etc.
    -James

  6. June 5th, 2009 at 08:11

    Hi,

    I am Marco, a GPUmat developer. Thanks for writing about us. We really appreciate all the suggestions, and I would like to ask you the following:

    We may go open-source some day; right now we just give out the library as freeware. Why do you think the registration is a problem? I mean, what are your real concerns about the registration? I am asking because we might change this, but we would also like to understand why people are afraid of registering to download a free tool.

    Many thanks for your suggestions
    Marco

    And then just a comment to James Malcolm :)
    1) We also handle memory management automatically
    2) We will support more functions in the future
    3) We are freeware and may go open-source one day

  7. Mike Croucher
    June 7th, 2009 at 19:32

    Hi Marco. Sorry for the delay in allowing you through moderation but I have been on vacation and had no net connection at all.

    I am against registration for several reasons. The first is that it is a barrier between users and your product. I have read research that suggests that some casual users lose interest in something if it takes too many CLICKS to get to – let alone registration. If it takes one click to get your product then many users will probably try it out – even if they are only a little bit interested. If the product is good then they’ll use it more and maybe contact you once they know more about it (I would have).

    If, instead of one click, that user then has to go through a registration process then it’s quite possible that they simply won’t bother. Time is short and so only people who REALLY want your product will bother with it. By requiring registration you have lost a set of casual users who may well have turned into power users.

    Would Firefox have built up such a strong user base if you had to register to get at the downloads?

    My next concern with registration is that I worry why someone wants my details in the first place. What do they plan on doing with the information I send them? Since I don’t know then I prefer to keep the information to myself (or supply fake information if I can possibly get away with it).

    Finally, if someone requires a registration process then their product isn’t really free. It costs some personal information and that’s a price that some may not want to pay.

    I’m really looking forward to trying your product out :)

    Cheers,
    Mike

  8. John Butcher
    October 14th, 2009 at 14:12

    Hi there. Interesting blog. I was wondering if you had had much experience of using either GPUmat or Jacket? I’ve been trying to convert some code I have which simulates a neural network but have found that quite a few things still need to be added to GPUmat. For example, you can’t do matrix division unless you use MATLAB scalars. There also appears to be little support for structs, so I have to copy data back and forth between the CPU and the GPU, which causes some overhead. I am also struggling with cell matrices as, again, there appears to be little support for these. Have you had similar issues? If so, any ideas? Using cells on the GPU is causing me lots of headaches!

    Cheers,

    John

  9. Mike Croucher
    October 14th, 2009 at 15:37

    Hi John

    I have a lot less experience than you it seems! I’d try speaking to the developers to see what they say if I were you – sorry I can’t be more help.

    Best Wishes,
    Mike

  10. Griff
    October 22nd, 2016 at 02:16

    Your developer’s website link goes to a Japanese website that Google translates as a weird editorial about some guy getting sick on the job (potentially sick of the job?) entitled “There job was father.” Is the link outdated?

  11. Mike Croucher
    October 26th, 2016 at 17:39

    Hi Griff.

    Yes, very outdated. This article is 7 years old!
    These days, I suggest using MathWorks’ Parallel Computing Toolbox for GPU computing.
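
    For reference, the example from the original post looks something like this with the Parallel Computing Toolbox’s gpuArray type (a minimal sketch – it assumes you have a supported NVIDIA GPU and a MATLAB release with GPU support):

    A = gpuArray(single(rand(100))); % A is on GPU memory
    B = gpuArray(single(rand(100))); % B is on GPU memory
    C = A+B;       % executed on GPU
    D = fft(C);    % executed on GPU
    E = gather(D); % copy the result back to CPU memory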