Releases · kul-optec/AbstractOperators.jl
WaveletOperators-v0.1.0
Initial release of `WaveletOperators`, one of the new separate operator packages introduced with the v0.4.0 reorganization (see below).
NFFTOperators-v0.1.0
Initial release of `NFFTOperators`, one of the new separate operator packages introduced with the v0.4.0 reorganization (see below).
FFTWOperators-v0.1.0
Initial release of `FFTWOperators`, containing the FFTW-dependent operators moved out of the main package in v0.4.0 (see below).
DSPOperators-v0.1.0
Initial release of `DSPOperators`, containing the DSP-dependent operators moved out of the main package in v0.4.0 (see below).
v0.4.0
Breaking Changes:
- Rename `domainType` to `domain_type` and `codomainType` to `codomain_type` (see the migration sketch after this list).
- Separate operators with dependencies into new packages inside the repo (`DSPOperators`, `FFTWOperators`) and add new operators in separate packages (`NFFTOperators`, `WaveletOperators`).
- Move some basic definitions to a new package, `OperatorCore`, which contains the general functions previously defined in `properties.jl` (e.g., `is_linear`, `is_diagonal`). Only the functions that can also be defined for matrices are moved to this package.
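A minimal migration sketch for the rename; `MatrixOp` is used here only as an example operator, and the old names are shown commented out:

```julia
using AbstractOperators

L = MatrixOp(randn(20, 10))   # any AbstractOperator works here

# before v0.4.0:
# T = domainType(L); S = codomainType(L)

# from v0.4.0 on:
T = domain_type(L)      # element type of the domain, e.g. Float64
S = codomain_type(L)    # element type of the codomain
```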
Additional Changes:
- Introduce opt-out multi-threading in `DiagOp` and `BroadCast`.
- Add an option to control where scaling occurs in the `DFT` operator: never (default), forward, backward (this is essentially `ifft`), or both (scale by the square root of the scaling factor in both directions). This follows the convention of NumPy's FFT.
- Define `LinearAlgebra.opnorm` for `AbstractOperators`. Efficient implementations are provided where possible (e.g., `DiagOp`, `Eye`, `DFT`); power iterations are used otherwise.
- Additionally, introduce `estimate_opnorm`, which gives control over the maximum number of power iterations and the convergence threshold, and by default uses fewer iterations than `opnorm`. Its main use case is estimating the Lipschitz constant in the fast forward-backward algorithm (see the sketch after this list).
- Optimize `Compose` and `Scaling` by fusing operators when possible (e.g., a scaled `DiagOp` becomes a `DiagOp` with scaled weights, and two adjacent `DiagOp`s are fused when composed).
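A usage sketch for the new norm functionality. The `maxit` and `tol` keyword names for `estimate_opnorm` are assumptions for illustration; the released keyword names may differ.

```julia
using AbstractOperators, LinearAlgebra

d = randn(16)
D = DiagOp(d)
opnorm(D) ≈ maximum(abs, d)       # DiagOp has a closed-form operator norm

A = MatrixOp(randn(50, 16))
opnorm(A)                         # no closed form here: falls back to power iterations

# Cheaper, looser estimate, e.g. to bound the Lipschitz constant used by a
# fast forward-backward solver (keyword names are assumptions):
estimate_opnorm(A; maxit = 20, tol = 1e-3)
```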
v0.3.0
AbstractOperators v0.3.0
Closed issues:
- possible test failure in upcoming Julia version 1.5 (#16)
Merged pull requests:
- Install TagBot as a GitHub Action (#14) (@JuliaTagBot)
- Set documenter version, switch to GitHub workflow for CI (#15) (@lostella)
- Fixing spelling mistake 'invertable'->'invertible' (#18) (@eviatarbach)
- Update Project.toml (#20) (@lostella)
v0.2.2
New calculus rules:
- `Ax_mul_Bx` --> generalizes `NonLinearCompose` (see the sketch below)
- `Axt_mul_Bx`
- `Ax_mul_Bxt`
- `HadamardProd` --> generalizes `Hadamard`
Hadamard & NonLinearCompose will be deprecated in future versions of AbstractOperators.
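A sketch of the new rules applied to matrix-valued operators. The two-argument `MatrixOp(A, n)` form (acting on `n`-column matrices) and the semantics `P(x) == (Ax)*(Bx)` / `(Ax).*(Bx)` are assumptions inferred from the operator names; dimensions are chosen so the products are defined.

```julia
using AbstractOperators

A, B = randn(4, 4), randn(4, 4)
opA, opB = MatrixOp(A, 4), MatrixOp(B, 4)   # both map 4×4 matrices to 4×4 matrices

P = Ax_mul_Bx(opA, opB)        # nonlinear operator X ↦ (A*X)*(B*X)
X = randn(4, 4)
P * X ≈ (A * X) * (B * X)      # matrix product of the two operator outputs

H = HadamardProd(opA, opB)     # X ↦ (A*X) .* (B*X), generalizes Hadamard
H * X ≈ (A * X) .* (B * X)     # elementwise product of the two operator outputs
```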
Goodbye BlockArrays, welcome ArrayPartitions!
v0.2.0
Fixed a typo in the README.
Welcome Julia 1.0
Julia 1.0 update (#4)
- begin 0.7 transition: updated linearoperators
- updated Travis and AppVeyor
- fixed LBFGS
- fixing linear calculus
- fixed warnings in linear calculus
- fixed nonlinear ops
- enabled syntax tests; deprecated `(.*)` with `DiagOp`
- nonlinear calculus working, all tests passing 🎉
- updated README and REQUIRE
- removed some commented code
- updated README and last 0.6 removals