{"id":2646,"date":"2010-05-17T05:10:02","date_gmt":"2010-05-17T04:10:02","guid":{"rendered":"http:\/\/www.walkingrandomly.com\/?p=2646"},"modified":"2010-05-17T05:14:46","modified_gmt":"2010-05-17T04:14:46","slug":"maple-14-released","status":"publish","type":"post","link":"https:\/\/walkingrandomly.com\/?p=2646","title":{"rendered":"Maple 14 released"},"content":{"rendered":"<p>Version 14 of <a href=\"http:\/\/www.maplesoft.com\/products\/maple\/\">Maple<\/a> was released a couple of weeks ago and it appears to have some very cool stuff in it.\u00a0 Some of the highlights that stand out for me include:<\/p>\n<ul>\n<li>Accelerated linear algebra using graphics cards via <a href=\"http:\/\/www.nvidia.com\/object\/cuda_home_new.html\">NVIDIA&#8217;s CUDA<\/a>.\u00a0 Maple&#8217;s advertising blurb says that they have implemented matrix multiplication and that this will help speed up many linear algebra routines, since it is such a fundamental operation.\u00a0 I think that Maplesoft are the first of the big general-purpose mathematical packages to offer direct CUDA integration out of the box, and this development is well worth watching.\u00a0 The speed-ups that are possible with CUDA technology are nothing short of astonishing &#8211; hundreds of times in some cases.\u00a0 However, Maplesoft are going to need to add a lot more than matrix multiplication in order for this to be truly useful.\u00a0 A set of fast random number generators would be nice, for example (I&#8217;m thinking super-fast Monte Carlo simulations &#8211; the finance people would love it).<\/li>\n<li>Maple uses <a href=\"http:\/\/software.intel.com\/en-us\/intel-mkl\/\">Intel&#8217;s Math Kernel Library<\/a> (MKL) for many of its low-level numerical linear algebra routines, and this has been updated to version 10.0 in Maple 14.\u00a0 For 32-bit Windows users this speeds up certain operations quite a lot, but it is 64-bit Windows users who will really see the benefit, since 64-bit Maple 13 only used a set
of generic <a href=\"http:\/\/en.wikipedia.org\/wiki\/Basic_Linear_Algebra_Subprograms\">BLAS routines<\/a>.\u00a0 The practical upshot is that certain basic linear algebra routines, such as matrix multiplication, can be around <strong>10 times faster<\/strong> in 64-bit Windows Maple 14 compared to the previous version.\u00a0 I couldn&#8217;t find any mention of the Linux version.<\/li>\n<li>A shed load of updates to their differential equation solvers, including a new numerical routine called the <a href=\"http:\/\/en.wikipedia.org\/wiki\/Cash%E2%80%93Karp_method\">Cash-Karp pair<\/a>.<\/li>\n<li>The Maple Toolbox for <a href=\"http:\/\/www.mathworks.com\/\">MATLAB<\/a> is no longer a separate product and is now included with Maple itself.\u00a0 This is great news if you, like me, tend to work with several mathematical packages simultaneously.\u00a0 Of course, you need to have a copy of MATLAB installed to make use of this functionality &#8211; you don&#8217;t get a copy of MATLAB for free :)<\/li>\n<li>You can now import MATLAB binary files (compressed and uncompressed) directly into Maple using the ImportMatrix command.<\/li>\n<li>Another product, the Maple-NAG Connector, has also been integrated into Maple itself.\u00a0 This allows you to easily call the <a href=\"http:\/\/www.nag.co.uk\/numeric\/cl\/cldescription.asp\">NAG C Library<\/a> directly from Maple but, as with the MATLAB toolbox, you&#8217;ll have to purchase the NAG C Library separately to make use of it.<\/li>\n<\/ul>\n<p>As you can see, I tend to favour new features that lead to improved performance or better interoperability with other software packages in the first instance.\u00a0 New mathematics and usability features take a little longer to sink in (for me at least).<\/p>\n<p>I&#8217;ve not got a copy of Maple 14 yet but will try to write more if I upgrade (finances permitting).<\/p>\n<p>For a full list of changes check out <a 
href=\"http:\/\/www.maplesoft.com\/support\/help\/Maple\/view.aspx?path=updates%2fMaple14%2findex\">Maple&#8217;s online help section<\/a>.<\/p>\n<h3>More on Maple from Walking Randomly<\/h3>\n<ul>\n<li><a href=\"https:\/\/www.walkingrandomly.com\/?p=1563\">Japanese firm buys Maplesoft<\/a><\/li>\n<li><a href=\"https:\/\/www.walkingrandomly.com\/?p=2029\">Simulating Santa using Maple<\/a><\/li>\n<li><a href=\"https:\/\/www.walkingrandomly.com\/?p=1837\">Parallel Programming in Maple<\/a><\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>Version 14 of Maple was released a couple of weeks ago and it appears to have some very cool stuff in it.\u00a0 Some of the highlights that stand out for me include Accelerated linear algebra using graphics cards via NVIDIA&#8217;s CUDA. \u00a0\u00a0 Maple&#8217;s advertising blurb says that they have implemented matrix multiplication and that this [&hellip;]<\/p>\n","protected":false},"author":3,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"jetpack_post_was_ever_published":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":false,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[25,4],"tags":[],"class_list":["post-2646","post","type-post","status-publish","format-standard","hentry","category-maple","category-math-software"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_shortlink":"https:\/\/wp.me\/p3swhs-GG","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=\/wp\/v2\/posts\/2646","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=\/wp\/v2
\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=2646"}],"version-history":[{"count":7,"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=\/wp\/v2\/posts\/2646\/revisions"}],"predecessor-version":[{"id":2652,"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=\/wp\/v2\/posts\/2646\/revisions\/2652"}],"wp:attachment":[{"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=2646"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=2646"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=2646"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}