Load the Statistics add-on package
In[1]:=
Needs["Statistics`NormalDistribution`"]
Create some test values of (xi, yi)
In[2]:=
r = 3.3; x0 = 1.2; y0 = 3.4; sigmar = .3; n = 50;
In[3]:=
t = Map[{x0 + (r + #[[1]])Sin[#[[2]]], y0 + (r + #[[1]])Cos[#[[2]]]} &,
Transpose[{RandomArray[NormalDistribution[0, sigmar], n],
Table[Random[Real, 2\[Pi]], {n}]}]];
take a look at them
In[4]:=
ListPlot[t, PlotRange -> All, AspectRatio -> Automatic];
Here I may be making unwarranted assumptions that they are evenly distributed about the circle
Find x0 and y0, here identified as x1 and y1
In[5]:=
{x1, y1} = {Mean[First /@ t], Mean[Last /@ t]}
Out[5]=
{1.21852, 3.15013}
using x1 and y1 convert the list to a list of radii
In[6]:=
rlist = Sqrt[(#[[1]] - x1)^2 + (#[[2]] - y1)^2] & /@ t;
Find the mean and standard deviation of the list
In[7]:=
Mean[rlist]
Out[7]=
3.26131
In[8]:=
StandardDeviation[rlist]
Out[8]=
0.444839
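A quick way to eyeball the result (my addition, a sketch only) is to overlay the estimated circle on the data:
Show[ListPlot[t, PlotRange -> All, AspectRatio -> Automatic],
  Graphics[Circle[{x1, y1}, Mean[rlist]]]];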
Best regards
Dieter
--
Dieter Palme
dl7udp
mailto:dl7...@darc.de
mailto:dieter...@t-online.de
"Nigel King" <ki...@dircon.co.uk> schrieb im Newsbeitrag
news:augumr$12b$1...@smc.vnet.net...
Are you saying that you do have some code running in Mathematica, but it
just isn't giving a result that seems ideal?
If so, how about posting the code, the sample data, and the output?
Perhaps someone can see an improvement to make...
-- Steve S. Shaw (a.k.a. ToolmakerSteve)
I'll use Nigel's code to set things up:
r0 = 3.3; x0 = 1.2; y0 = 3.4; sigmar = .3; n = 50;
t = Map[{x0 + (r0 + #[[1]])Sin[#[[2]]], y0 + (r0 + #[[1]])Cos[#[[2]]]} &,
    Transpose[{RandomArray[NormalDistribution[0, sigmar], n],
      Table[Random[Real, 2 Pi], {n}]}]];
dataPlot = ListPlot[t, PlotRange -> All, AspectRatio -> Automatic];
{x1, y1} = {Mean[First /@ t], Mean[Last /@ t]}
rlist = Sqrt[(#[[1]] - x1)^2 + (#[[2]] - y1)^2] & /@ t;
r1 = Mean[rlist]
Here's a quadratic error function:
Clear[a, b, r]
error[a_, b_, r_, {x_, y_}] =
((x - a)^2 + (y - b)^2 - r^2)^2
error[a_, b_, r_, s:{{_, _}..}] :=
Expand[Tr[(error[a, b, r, #1] & ) /@ s]]
Set the derivatives equal to zero, solve, select for positive real r
values, and choose the more reasonable alternative (the Last solution,
in the examples I've done):
D[error[a, b, r, t], #] == 0 & /@ {a, b, r}
Chop@NSolve[%, {a, b, r}];
Select[%, Positive[r /. #] &]
{x2, y2, r2} = {a, b, r} /. Last@%
Plot the data and the three circles (Red for the defining circle, Blue
for Nigel's solution, and Gold for the least-squares solution):
<<Graphics`Colors`
Show[
dataPlot,
Graphics[
{ Red,
Circle[{x0, y0}, r0],
Blue,
Circle[{x1, y1}, r1],
Gold,
Circle[{x2, y2}, r2]
}],
PlotRange -> All,
AspectRatio -> Automatic
]
I think the Gold circle will usually be a better fit than the Blue one.
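One way to check this numerically (my addition, using the error function defined above) is to compare the two totals; the least-squares solution should give the smaller value by construction:
error[x1, y1, r1, t]
error[x2, y2, r2, t]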
Bobby
Let's say that sample is the list with your points in the form:
sample={{x1,y1},...}
Also:
n=Length[sample]
norm[v_] := Sqrt[v.v]
You can find a decent approximation of {x0,y0} by using:
center=(Plus@@@Transpose[sample])/n
Now you can get a decent approximation of the radius by using:
radius=Plus@@(norm[center-#]&/@sample)/n
Let's estimate the total error for an arbitrary c (center) and
r (radius):
error[c_,r_][sample_]:=Plus@@((norm[c-#]^2-r^2)^2&/@sample)
You can check our approximation by evaluating
error[center,radius][sample]
It is not that good, right?
OK, to get a better approximation we can try to minimize the total
error numerically, using center and radius as initial values:
sol = FindMinimum[error[{x0, y0}, r][sample],
  {x0, center[[1]]}, {y0, center[[2]]}, {r, radius}]
This returns a list of the form:
{min_,{x0->_,y0->_,r->_}}
To begin with, min/n is related to sigma_r: for small radial noise each
term in the error is roughly (2 r sigma_r)^2, so Sqrt[min/n]/(2 r) gives
a rough estimate of sigma_r.
Anyway, the {x0,y0} and r returned are much better fitted to your
data.
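For concreteness, here is a small sketch of pulling the fitted values out of sol (the names min, rules, x0fit, y0fit, rfit are mine, not part of the original post; it assumes x0, y0, r are otherwise unassigned, as above):
{min, rules} = sol;                            (* sol as returned by FindMinimum above *)
{x0fit, y0fit, rfit} = {x0, y0, r} /. rules;   (* fitted center and radius *)
Sqrt[min/n]/(2 rfit)                           (* rough estimate of sigma_r, per the note above *)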
Visualize it:
Show[Graphics[{Circle[{0, 2}, 1], {Hue[0], Circle[center, radius]},
    {Hue[.7], Circle[{x0, y0}, r] /. Last[sol]}, Point /@ sample}],
  AspectRatio -> Automatic];
Hope that helped,
Orestis
I don't know what sigma_r is but presumably what you want to estimate
are radius and coordinates of the center. The best approach I know
starts with the observation that it can be cast as a linear least
squares problem. To do this you write the circle equation as
x^2+y^2 = a*x+b*y+c where you want to find the parameters {a,b,c}. So we
set up a matrix equation using the given data as follows.
{{x1,y1,1},{x2,y2,1},...,{xn,yn,1}} . {a,b,c} =
{x1^2+y1^2,x2^2+y2^2,...,xn^2+yn^2}
This can be solved in least squares by a number of techniques using e.g.
singular values or Cholesky or Q-R decomposition. For some code details
see the Further Examples in the Mathematica Help Browser under each of
those various functions. In the code below I use the QRDecomposition
approach but the others are substantially similar.
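For example, the singular-value route amounts to a one-liner (a sketch, with mat and rhs denoting the coefficient matrix and right-hand side of the matrix equation above; PseudoInverse is computed via the singular value decomposition):
params = PseudoInverse[mat].rhs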
The next step is to convert to the form (x-x0)^2 + (y-y0)^2 == r^2. This
is readily accomplished once we have the parameters {a,b,c}: completing
the square gives {x0,y0} = {a,b}/2 and r^2 = c + (a^2+b^2)/4. Note that
this readily extends to higher dimension, so the code does not insist
that the data be in two dimensions. The code below returns a pair of the
form {radius, center coordinates}. We could make the code a bit more
terse, but it is already reasonably short.
hypersphereFit[data_List] := Module[
{mat, rhs, qq, rr, params, cen},
mat = Map[Append[#,1]&, data];
rhs = Map[#.#&, data];
{qq, rr} = QRDecomposition[mat];
params = LinearSolve[rr, qq.rhs];
cen = Drop[params,-1]/2;
{Sqrt[Last[params] + cen.cen], cen}
]
We generate some data to check this. It is intentionally made to lie
near just a part of the circle; we keep the angle between 0 and 1 in
radian measure and we use random noise so that it does not lie exactly
on the circle. The "base" circle has radius 2 and center {1,-3}.
points = Table[theta=Random[];
{2*Cos[theta]+1+.05*Random[],2*Sin[theta]-3+.05*Random[]}, {30}]
You can readily check that the points lie "almost" on a circle.
ListPlot[points]
Now we recover our approximate circle.
In[55]:= hypersphereFit[points]
Out[55]= {1.95099, {1.06397, -2.9453}}
There was a prior discussion on this topic in MathGroup around April or
May of 1995. I'm not sure which parts of that thread can be found in a
web search but I do recall that some of the techniques presented in
responses made use of linear least squares.
Daniel Lichtblau
Wolfram Research
"Dieter Palme" <dieter...@t-online.de> schrieb im Newsbeitrag
news:aueij4$522$1...@smc.vnet.net...
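' Note: this excerpt assumes the running sums (Sxxx, Syyy, Sxxy, Sxyy,
' Sxx, Syy, Sxy, Sx, Sy) start at 0 and that N holds the number of points.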
For z = StartRow(w) To EndRow(w)
x = Sheet1.Cells(z, xCol).Value 'each xi
y = Sheet1.Cells(z, yCol).Value 'each yi
Sxxx = Sxxx + x ^ 3
Syyy = Syyy + y ^ 3
Sxxy = Sxxy + (x ^ 2 * y)
Sxyy = Sxyy + (x * y ^ 2)
Sxx = Sxx + x ^ 2
Syy = Syy + y ^ 2
Sxy = Sxy + (x * y)
Sx = Sx + x
Sy = Sy + y
Next z
A = N * Sxx - Sx ^ 2
B = N * Sxy - Sx * Sy
C = N * Syy - Sy ^ 2
D = 0.5 * (N * Sxyy - Sx * Syy + N * Sxxx - Sx * Sxx)
E = 0.5 * (N * Sxxy - Sy * Sxx + N * Syyy - Sy * Syy)
Xc = (D * C - B * E) / (A * C - B ^ 2)
Yc = (A * E - B * D) / (A * C - B ^ 2)
R = Sqr((Sxx - 2 * Xc * Sx + N * Xc * Xc + Syy - 2 * Yc * Sy + _
        N * Yc * Yc) / N)
I have found it to be really useful: given all (xi, yi) on a circle, it
spits back the radius (R) and center (Xc, Yc).
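For readers who want to stay in Mathematica, here is a sketch translating the same closed-form sums (the function name circleFitMLS and the variable names are mine; it should return {R, {Xc, Yc}} in the same {radius, center} order as hypersphereFit above):
circleFitMLS[pts_List] := Module[
  {x, y, m, sx, sy, sxx, syy, sxy, sxxx, syyy, sxxy, sxyy,
   aa, bb, cc, dd, ee, xc, yc},
  {x, y} = Transpose[pts];        (* coordinate lists *)
  m = Length[pts];
  sx = Plus @@ x; sy = Plus @@ y;
  sxx = Plus @@ (x^2); syy = Plus @@ (y^2); sxy = Plus @@ (x y);
  sxxx = Plus @@ (x^3); syyy = Plus @@ (y^3);
  sxxy = Plus @@ (x^2 y); sxyy = Plus @@ (x y^2);
  aa = m sxx - sx^2;
  bb = m sxy - sx sy;
  cc = m syy - sy^2;
  dd = (m sxyy - sx syy + m sxxx - sx sxx)/2;
  ee = (m sxxy - sy sxx + m syyy - sy syy)/2;
  xc = (dd cc - bb ee)/(aa cc - bb^2);
  yc = (aa ee - bb dd)/(aa cc - bb^2);
  {Sqrt[(sxx - 2 xc sx + m xc^2 + syy - 2 yc sy + m yc^2)/m], {xc, yc}}
]
For instance, circleFitMLS[points] can be compared with the hypersphereFit result shown earlier.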
more info at
http://groups.google.com/groups?hl=en&lr=&ie=UTF-8&threadm=35f59745.37468454%40news.wwnet.net&rnum=1&prev=/groups%3Fhl%3Den%26lr%3D%26ie%3DISO-8859-1%26q%3Dbest%2Bfit%2Bcircle%26meta%3D
from an IEEE PAMI paper by Thomas and Chan
http://www.cs.bsu.edu/homepages/math/people/regfac/kjones/circles.pdf
'A Few Methods for Fitting Circles to Data' by Dale Umbach and Kerry
N. Jones from Ball State University
melanie