Replies: 3 comments 1 reply
-
You could try passing a custom Gaussian operator; that might help it get the peak shape. Perhaps worth trying a skewed Gaussian operator as well. It might also be worth giving it fake data to nail down the asymptotic behavior: for example, you could add a data point (x=1e9, y=0) and weight it very strongly so the fit definitely passes through that point.
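A custom operator in PySR is written as a Julia definition string, paired with a SymPy mapping so fitted equations can be exported. This is a hedged sketch of what that might look like here; the name `gauss` and the operator body are my own illustration, not from the thread:

```python
import sympy

# Custom Gaussian unary operator for PySR: a Julia definition string plus
# the matching SymPy mapping PySR needs when exporting equations.
gauss_julia = "gauss(x) = exp(-x*x)"
gauss_sympy = {"gauss": lambda x: sympy.exp(-(x**2))}

# It would be passed to the regressor roughly like this:
# model = PySRRegressor(
#     unary_operators=["exp", gauss_julia],
#     extra_sympy_mappings=gauss_sympy,
# )

x = sympy.Symbol("x")
print(gauss_sympy["gauss"](x))  # exp(-x**2)
```

A skewed variant could be defined the same way, e.g. with an extra linear factor inside the exponential.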
-
I might also try an L1 loss, to see if that helps.
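A toy illustration of why the loss choice matters (the numbers are invented): L1 and L2 distribute emphasis differently between a few large residuals and many small ones, so swapping losses can change which part of the spectrum the search prioritizes.

```python
import numpy as np

# Invented residuals: 100 small noise-floor misses and one large peak miss.
residuals = np.array([0.01] * 100 + [1.0])

l2_total = float(np.sum(residuals**2))       # ~1.01
l1_total = float(np.sum(np.abs(residuals)))  # ~2.0

peak_share_l2 = 1.0**2 / l2_total  # ~0.99: under L2 the peak dominates the loss
peak_share_l1 = 1.0 / l1_total     # 0.5: under L1 the floor counts equally

# In PySR the switch would just be: elementwise_loss="L1DistLoss()"
# (same LossFunctions.jl family as the "LPDistLoss{2}()" used in the question).
```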
-
Dear All
For a wider project I am using LED light sources from Thorlabs,
e.g. https://www.thorlabs.com/fiber-coupled-leds?tabName=Overview
I am hoping to do some Bayesian inference with the spectral distributions that Thorlabs kindly provides as raw data. Normally I would simply put a smoothing filter through one of the spectra and be on my way. However, I thought: why not try to get an actual mathematical expression for the intensity distributions out of PySR?
It would appear that fitting any of these uni-modal, non-Gaussian curves is very challenging for PySR, and I don't really understand why.
By removing very small intensity readings (i.e. a noise floor) from the data, and passing an empirical mean wavelength and an empirical standard deviation in addition to the wavelength and the intensity, I do finally get a curve that goes through the peak, but it does a terrible job on the noise floor. If I leave the noise floor in, it seems to overfit the noise floor but ignore the peak.
This is for LED product number M530F3.
Am I doing this in a wrong-headed way? Here is a code snippet:
```python
testx = data_normalized_sub[["wavelength", "lambda_bar", "sigma"]][:]
testy = data_normalized_sub["intensity"][:].to_numpy()
model = PySRRegressor(
    maxsize=40,
    niterations=1000,
    populations=3 * 8,
    parsimony=1e-6,
    binary_operators=["+", "*", "^"],
    unary_operators=["exp"],
    elementwise_loss="LPDistLoss{2}()",
    constraints={"^": (-1, 9)},  # only a constant or a linear term in exponent
    nested_constraints={
        "^": {"^": 1},
        # exponentiation can be 1 deep (e^x^2 is allowed, but not e^e^x)
        "exp": {"exp": 0, "^": 1},
    },
    model_selection="best",
)
model.fit(testx, testy)
```
If I put the noise floor back in I get this.
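One option worth noting: instead of deleting the noise floor outright, it can be kept but down-weighted, since `PySRRegressor.fit` accepts per-sample weights. A hedged sketch; the threshold and weight values below are invented for illustration:

```python
import numpy as np

# Invented normalized intensities: three peak-region points, three floor points.
intensity = np.array([0.002, 0.004, 0.30, 1.00, 0.45, 0.003])

# Keep floor points but make them count 20x less than peak points.
weights = np.where(intensity > 0.02, 1.0, 0.05)

# This would slot into the snippet above as:
# model.fit(testx, testy, weights=weights)
```

The same mechanism can carry the strongly weighted asymptotic anchor point suggested earlier in the thread.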