# Fitting multiple gaussian using **curve_fit** function from scipy using python 3.x

By : KeithKaaos
Date : November 21 2020, 03:00 PM
It looks like you are calling curve_fit nine separate times, each with a different initial parameter guess specified by p0=p[i] (which is probably not what you intended, because p is a nested list).
You should make sure that p is a one-dimensional array with 9 elements, and call curve_fit only once. Something like code :
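(Here trimodal_gauss is assumed to be the sum of three Gaussians, with parameters in the same (centroid, sigma, height) order the flattened p0 uses; the asker's actual definition isn't shown, so this is only a sketch.)

```python
import numpy as np

# Assumed definition of trimodal_gauss: the sum of three Gaussians,
# taking the 9 parameters in (center, sigma, height) order per peak.
def trimodal_gauss(x, c1, s1, h1, c2, s2, h2, c3, s3, h3):
    g = lambda x, c, s, h: h * np.exp(-(x - c)**2 / (2 * s**2))
    return g(x, c1, s1, h1) + g(x, c2, s2, h2) + g(x, c3, s3, h3)
```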
``````from scipy.optimize import curve_fit

p = np.array([list(t) for t in zip(centroid, sigma, height)]).flatten()
popt, pcov = curve_fit(trimodal_gauss, my_x, my_y, p0=p)
``````

## gaussian fit with scipy.optimize.curve_fit in python with wrong results

By : Rahl
Date : March 29 2020, 07:55 AM
Your problem is with the initial parameters of curve_fit. By default, if no other information is given, it starts with an array of ones, but this obviously leads to a radically wrong result. This can be corrected simply by giving a reasonable starting vector. To do this, start from the estimated mean and standard deviation of your dataset:
code :
``````import numpy as np
import matplotlib.pyplot as plt

# estimate mean and standard deviation, weighted by the data
mean = sum(x * y) / sum(y)
sigma = np.sqrt(sum(y * (x - mean)**2) / sum(y))
# do the fit!
popt, pcov = curve_fit(gauss_function, x, y, p0=[1, mean, sigma])
# plot the fit results
plt.plot(x, gauss_function(x, *popt))
# compare with the given data
plt.plot(x, y, 'ok')
``````

## Fitting a 2D Gaussian function using scipy.optimize.curve_fit - ValueError and minpack.error

By : leix
Date : March 29 2020, 07:55 AM
The output of twoD_Gaussian needs to be 1-D. What you can do is add a .ravel() onto the end of the last line, like this:
code :
``````import numpy as np
import scipy.optimize as opt
import matplotlib.pyplot as plt

# note: tuple arguments in a function signature are Python 2 only;
# in Python 3 the (x, y) pair must be unpacked inside the function
def twoD_Gaussian(xy, amplitude, xo, yo, sigma_x, sigma_y, theta, offset):
    x, y = xy
    xo = float(xo)
    yo = float(yo)
    a = (np.cos(theta)**2)/(2*sigma_x**2) + (np.sin(theta)**2)/(2*sigma_y**2)
    b = -(np.sin(2*theta))/(4*sigma_x**2) + (np.sin(2*theta))/(4*sigma_y**2)
    c = (np.sin(theta)**2)/(2*sigma_x**2) + (np.cos(theta)**2)/(2*sigma_y**2)
    g = offset + amplitude*np.exp(-(a*((x-xo)**2) + 2*b*(x-xo)*(y-yo)
                                    + c*((y-yo)**2)))
    return g.ravel()
``````
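To see why the .ravel() matters: curve_fit compares the model output against a flat 1-D ydata array, so a model evaluated on 2-D grids must return a flattened result. A minimal self-contained sketch of the same pattern, using a simple plane instead of a Gaussian (the function name and values here are illustrative, not from the question):

```python
import numpy as np
from scipy.optimize import curve_fit

# The model takes the (x, y) grids as one argument and returns a 1-D array.
def plane(xy, a, b, c):
    x, y = xy
    return (a * x + b * y + c).ravel()

x, y = np.meshgrid(np.linspace(0, 1, 20), np.linspace(0, 1, 20))
zdata = plane((x, y), 2.0, -1.0, 0.5)   # already flat: length 400

# curve_fit accepts the grid tuple as xdata because the model unpacks it
popt, pcov = curve_fit(plane, (x, y), zdata, p0=(1, 1, 0))
# popt recovers approximately (2.0, -1.0, 0.5)
```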
``````# Create x and y indices
x = np.linspace(0, 200, 201)
y = np.linspace(0, 200, 201)
x, y = np.meshgrid(x, y)

#create data
data = twoD_Gaussian((x, y), 3, 100, 100, 20, 40, 0, 10)

# plot twoD_Gaussian data generated above
plt.figure()
plt.imshow(data.reshape(201, 201))
plt.colorbar()
``````
``````# add some noise to the data and try to fit the data generated beforehand
initial_guess = (3,100,100,20,40,0,10)

data_noisy = data + 0.2*np.random.normal(size=data.shape)

popt, pcov = opt.curve_fit(twoD_Gaussian, (x, y), data_noisy, p0=initial_guess)
``````
``````data_fitted = twoD_Gaussian((x, y), *popt)

fig, ax = plt.subplots(1, 1)
# ax.hold() was removed in matplotlib 2.0; axes hold by default now
ax.imshow(data_noisy.reshape(201, 201), cmap=plt.cm.jet, origin='lower',
          extent=(x.min(), x.max(), y.min(), y.max()))
ax.contour(x, y, data_fitted.reshape(201, 201), 8, colors='w')
plt.show()
``````

## Fitting a vector function with curve_fit in Scipy

By : RUrlus
Date : March 29 2020, 07:55 AM
I think what you're doing is perfectly fine from an efficiency standpoint. I'll try to look at the implementation and come up with something more quantitative, but for the time being here is my reasoning.
What you're doing during curve fitting is optimizing the parameters (a,b) such that
code :
``````res = sum_i |f(x_i; a,b)-y_i|^2
``````
is minimized. When f returns a vector, each squared residual term is itself a sum over the output components:
``````|f(x_i; a,b)-y_i|^2 == sum_k (f(x_i; a,b)[k]-y_i[k])^2
``````

## Python: Data fitting with scipy.optimize.curve_fit with sigma = 0

By : Kity_Pei
Date : March 29 2020, 07:55 AM
Why not just drop the variable? If it has zero variance it cannot contribute in any meaningful way to your analysis.

## Improving Gaussian fitting using ***curve_fit*** from scipy and python 3.x

By : Marc Jordan
Date : March 29 2020, 07:55 AM
If you use the lmfit module (https://github.com/lmfit/lmfit-py), you can easily put bounds on the centroids of your Gaussian functions, or even fix them. lmfit also makes it easy to build up multi-peak models.
You didn't give a complete example or link to your data, but a fit with lmfit to your data might look like this:
code :
``````import numpy as np
import matplotlib.pyplot as plt
from lmfit.models import GaussianModel

# data: two-column (x, y) array loaded from the asker's (unshared) file
my_x = data[:, 0]
my_y = data[:, 1]

model = (GaussianModel(prefix='p1_') +
         GaussianModel(prefix='p2_') +
         GaussianModel(prefix='p3_'))

params = model.make_params(p1_amplitude=100, p1_sigma=2, p1_center=2262,
                           p2_amplitude=100, p2_sigma=2, p2_center=2269,
                           p3_amplitude=100, p3_sigma=2, p3_center=2276)

# set boundaries on the Gaussian Centers:
params['p1_center'].min = 2260
params['p1_center'].max = 2264

params['p2_center'].min = 2267
params['p2_center'].max = 2273

params['p3_center'].min = 2274
params['p3_center'].max = 2279

# or you could just fix one of the centroids like this:
params['p3_center'].vary = False

# if needed, you could force all the sigmas to be the same value
# or related by simple mathematical expressions
params['p2_sigma'].expr = 'p1_sigma'
params['p3_sigma'].expr = '2*p1_sigma'

# fit this model to data:
result = model.fit(my_y, params, x=my_x)

# print results
print(result.fit_report())

# evaluate individual gaussian components:
peaks = model.eval_components(params=result.params, x=my_x)

# plot results:
plt.plot(my_x, my_y, label='data')
plt.plot(my_x, result.best_fit, label='best fit')
plt.plot(my_x, peaks['p1_'])
plt.plot(my_x, peaks['p2_'])
plt.plot(my_x, peaks['p3_'])
plt.show()
`````` 