{"id":5180,"date":"2013-12-06T13:20:25","date_gmt":"2013-12-06T12:20:25","guid":{"rendered":"http:\/\/www.walkingrandomly.com\/?p=5180"},"modified":"2013-12-06T16:12:13","modified_gmt":"2013-12-06T15:12:13","slug":"simple-nonlinear-least-squares-curve-fitting-in-mathematica","status":"publish","type":"post","link":"https:\/\/walkingrandomly.com\/?p=5180","title":{"rendered":"Simple nonlinear least squares curve fitting in Mathematica"},"content":{"rendered":"<p>A question I get asked a lot is \u2018How can I do <a href=\"http:\/\/en.wikipedia.org\/wiki\/Non-linear_least_squares\">nonlinear least squares<\/a> curve fitting in X?\u2019 where X might be MATLAB, Mathematica or a whole host of alternatives. \u00a0Since this is such a common query, I thought I\u2019d write up how to do it for a very simple problem in several systems that I\u2019m interested in.<\/p>\n<p>This is the Mathematica version. For other versions, see the list below.<\/p>\n<ul>\n<li><a href=\"https:\/\/www.walkingrandomly.com\/?p=5218\">Simple nonlinear least squares curve fitting in Maple<\/a><\/li>\n<li><a href=\"https:\/\/www.walkingrandomly.com\/?p=5181\">Simple nonlinear least squares curve fitting in\u00a0Julia<\/a><\/li>\n<li><a href=\"https:\/\/www.walkingrandomly.com\/?p=5196\">Simple nonlinear least squares curve fitting in MATLAB<\/a><\/li>\n<li><a href=\"https:\/\/www.walkingrandomly.com\/?p=5215\">Simple nonlinear least squares curve fitting in Python<\/a><\/li>\n<li><a href=\"https:\/\/www.walkingrandomly.com\/?p=5254\">Simple nonlinear least squares curve fitting in R<\/a><\/li>\n<\/ul>\n<p><strong>The problem<\/strong><\/p>\n<p>You have the following 10 data points:<\/p>\n<pre>xdata = -2,-1.64,-1.33,-0.7,0,0.45,1.2,1.64,2.32,2.9\r\nydata = 0.699369,0.700462,0.695354,1.03905,1.97389,2.41143,1.91091,0.919576,-0.730975,-1.42001<\/pre>\n<p>and you&#8217;d like to fit the function<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" 
src=\"https:\/\/walkingrandomly.com\/wp-content\/ql-cache\/quicklatex.com-17f46b025dd91e024cf2dc04211e19ba_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#32;&#70;&#40;&#112;&#95;&#49;&#44;&#112;&#95;&#50;&#44;&#120;&#41;&#32;&#61;&#32;&#112;&#95;&#49;&#32;&#92;&#99;&#111;&#115;&#40;&#112;&#95;&#50;&#32;&#120;&#41;&#43;&#112;&#95;&#50;&#32;&#92;&#115;&#105;&#110;&#40;&#112;&#95;&#49;&#32;&#120;&#41;&#32;\" title=\"Rendered by QuickLaTeX.com\" height=\"18\" width=\"297\" style=\"vertical-align: -4px;\"\/><\/p>\n<p>using nonlinear least squares. \u00a0Your starting guesses for the parameters are p1=1 and p2=0.2.<\/p>\n<p>For now, we are primarily interested in the following results:<\/p>\n<ul>\n<li>The fit parameters<\/li>\n<li>Sum of squared residuals<\/li>\n<\/ul>\n<p>Future updates of these posts will show how to get other results such as confidence intervals. Let me know what you are most interested in.<\/p>\n<p><strong>Mathematica Solution using FindFit<\/strong><br \/>\nFindFit is the basic nonlinear curve fitting routine in Mathematica.<\/p>\n<pre>xdata={-2,-1.64,-1.33,-0.7,0,0.45,1.2,1.64,2.32,2.9};\r\nydata={0.699369,0.700462,0.695354,1.03905,1.97389,2.41143,1.91091,0.919576,-0.730975,-1.42001};\r\n\r\n(*Mathematica likes to have the data in the form {{x1,y1},{x2,y2},..}*)\r\ndata = Partition[Riffle[xdata, ydata], 2];\r\n\r\nFindFit[data, p1*Cos[p2 x] + p2*Sin[p1 x], {{p1, 1}, {p2, 0.2}}, x]\r\n\r\nOut[4]:={p1-&gt;1.88185,p2-&gt;0.70023}<\/pre>\n<p><strong>Mathematica Solution using NonlinearModelFit<\/strong><br \/>\nYou can get a lot more information about the fit using the NonlinearModelFit function.<\/p>\n<pre>(*Set up data as before*)\r\nxdata={-2,-1.64,-1.33,-0.7,0,0.45,1.2,1.64,2.32,2.9};\r\nydata={0.699369,0.700462,0.695354,1.03905,1.97389,2.41143,1.91091,0.919576,-0.730975,-1.42001};\r\ndata = Partition[Riffle[xdata, ydata], 2];\r\n\r\n(*Create the NonlinearModelFit object*)\r\nnlm = NonlinearModelFit[data, p1*Cos[p2 x] + 
p2*Sin[p1 x], {{p1, 1}, {p2, 0.2}}, x];<\/pre>\n<p>The NonlinearModelFit object contains many properties that may be useful to us. Here&#8217;s how to list them all:<\/p>\n<pre>nlm[\"Properties\"]\r\n\r\nOut[10]= {\"AdjustedRSquared\", \"AIC\", \"AICc\", \"ANOVATable\", \\\r\n\"ANOVATableDegreesOfFreedom\", \"ANOVATableEntries\", \"ANOVATableMeanSquares\", \\\r\n\"ANOVATableSumsOfSquares\", \"BestFit\", \"BestFitParameters\", \"BIC\", \\\r\n\"CorrelationMatrix\", \"CovarianceMatrix\", \"CurvatureConfidenceRegion\", \"Data\", \\\r\n\"EstimatedVariance\", \"FitCurvatureTable\", \"FitCurvatureTableEntries\", \\\r\n\"FitResiduals\", \"Function\", \"HatDiagonal\", \"MaxIntrinsicCurvature\", \\\r\n\"MaxParameterEffectsCurvature\", \"MeanPredictionBands\", \\\r\n\"MeanPredictionConfidenceIntervals\", \"MeanPredictionConfidenceIntervalTable\", \\\r\n\"MeanPredictionConfidenceIntervalTableEntries\", \"MeanPredictionErrors\", \\\r\n\"ParameterBias\", \"ParameterConfidenceIntervals\", \\\r\n\"ParameterConfidenceIntervalTable\", \\\r\n\"ParameterConfidenceIntervalTableEntries\", \"ParameterConfidenceRegion\", \\\r\n\"ParameterErrors\", \"ParameterPValues\", \"ParameterTable\", \\\r\n\"ParameterTableEntries\", \"ParameterTStatistics\", \"PredictedResponse\", \\\r\n\"Properties\", \"Response\", \"RSquared\", \"SingleDeletionVariances\", \\\r\n\"SinglePredictionBands\", \"SinglePredictionConfidenceIntervals\", \\\r\n\"SinglePredictionConfidenceIntervalTable\", \\\r\n\"SinglePredictionConfidenceIntervalTableEntries\", \"SinglePredictionErrors\", \\\r\n\"StandardizedResiduals\", \"StudentizedResiduals\"}<\/pre>\n<p>Let&#8217;s extract the fit parameters, 95% confidence intervals, and residuals:<\/p>\n<pre>{params, confidenceInt, res} = \r\n nlm[{\"BestFitParameters\", \"ParameterConfidenceIntervals\", \"FitResiduals\"}]\r\n\r\nOut[22]= {{p1 -&gt; 1.88185, \r\n  p2 -&gt; 0.70023}, {{1.8186, 1.9451}, {0.679124, \r\n   0.721336}}, {-0.0276906, -0.0322944, -0.0102488, 0.0566244, \r\n  
0.0920392, 0.0976307, 0.114035, 0.109334, 0.0287154, -0.0700442}}<\/pre>\n<p>The parameters are given as replacement rules. Here, we convert them to pure numbers:<\/p>\n<pre>{p1, p2} = {p1, p2} \/. params\r\n\r\nOut[38]= {1.88185,0.70023}<\/pre>\n<p>Although only a few decimal places are shown, p1 and p2 are stored in full double precision. You can see this by converting to InputForm:<\/p>\n<pre>InputForm[{p1, p2}]\r\n\r\nOut[43]\/\/InputForm=\r\n{1.8818508498053645, 0.7002298171759191}<\/pre>\n<p>Similarly, let&#8217;s look at the 95% confidence intervals, extracted earlier, in full precision:<\/p>\n<pre>confidenceInt \/\/ InputForm\r\n\r\nOut[44]\/\/InputForm=\r\n{{1.8185969887307214, 1.9451047108800077}, \r\n {0.6791239458086734, 0.7213356885431649}}<\/pre>\n<p>Calculate the sum of squared residuals:<\/p>\n<pre>resnorm = Total[res^2]\r\n\r\nOut[45]= 0.0538127<\/pre>\n<p><strong>Notes<\/strong><br \/>\nI used Mathematica 9 on 64-bit Windows 7 to perform these calculations.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>A question I get asked a lot is \u2018How can I do nonlinear least squares curve fitting in X?\u2019 where X might be MATLAB, Mathematica or a whole host of alternatives. 
\u00a0Since this is such a common query, I thought I\u2019d write up how to do it for a very simple problem in several systems [&hellip;]<\/p>\n","protected":false},"author":3,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"jetpack_post_was_ever_published":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[4,8,7],"tags":[],"class_list":["post-5180","post","type-post","status-publish","format-standard","hentry","category-math-software","category-mathematica","category-programming"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_shortlink":"https:\/\/wp.me\/p3swhs-1ly","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=\/wp\/v2\/posts\/5180","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=5180"}],"version-history":[{"count":33,"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=\/wp\/v2\/posts\/5180\/revisions"}],"predecessor-version":[{"id":5299,"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=\/wp\/v2\/posts\/5180\/revisions\/5299"}],"wp:attachment":[{"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=5180"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=%2Fwp%2Fv
2%2Fcategories&post=5180"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/walkingrandomly.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=5180"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}