Support for block Krylov methods #73
Yes, I think it would be the right thing to do to allow it.
Note that if this is done, we may need to require packages like GenericSVD.jl to support number types like BigFloat, since the SVD is used for the 2-norm and no generic fallback is given in Base.
Hi, has anyone made progress on this issue? In my case, I need to solve a 2d PDE with a matrix-free method. I represent the solution as a matrix. The Frobenius norm would do for the stopping criterion.
Just to be sure, I think what you mean is support for custom vector types, right? Block methods are about solving a system AX = B for multiple right-hand sides, in the sense that you solve AX[:, i] = B[:, i] for all i independently. This is not the same as representing the unknowns in a shape similar to your geometry (there you still have a single rhs).

In principle the iterative method is simply unaware of the geometry of your domain, so you could do the transformation from vector to your internal geometric representation of the unknowns before applying your matrix-free A, and then transform it back to a vector. In fact, this should be a no-op, since the underlying data structure is the same: it's just reinterpreting a vector.
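To make the distinction concrete, here is a minimal sketch (assuming IterativeSolvers' `gmres` and a small dense test matrix) of what a "block" solve means, versus a geometry-shaped unknown that is just a reshaped vector:

```julia
using LinearAlgebra, IterativeSolvers

# A block solve: multiple independent right-hand sides,
# i.e. solve A * X[:, i] = B[:, i] for each column i.
A = rand(10, 10) + 10I   # well-conditioned test matrix
B = rand(10, 3)          # three right-hand sides
X = similar(B)
for i in axes(B, 2)
    X[:, i] = gmres(A, B[:, i])
end

# By contrast, a 2d-shaped unknown with a single rhs is just a reshaped vector:
U = rand(4, 4)           # "geometric" representation of one unknown
u = vec(U)               # no copy; u and U share the same underlying data
```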
You are right, my 2d array is like a vector; it is not meant for solving block systems. I thought about using … For example, if I take …
It's a general problem. See also JuliaLang/julia#25565. In Julia, we have generally associated vectors with 1d arrays and linear maps with 2d arrays, but, as you mention, it doesn't have to be like that; it just makes things simpler. It looks like people would like a broader vector definition to happen for …
I'm on Julia 0.7 right now, so I can't really test this approximately working Julia 0.6 code:

```julia
import Base: A_mul_B!

struct MyLinearOp
    n::Int
end

function A_mul_B!(y, A::MyLinearOp, x)
    fft!(reshape(copy!(y, x), A.n, A.n), 2)  # copy x -> y, reshape to a matrix, then in-place fft; reshape shouldn't copy iirc
    y  # return the original y, not the reshaped reference to it
end

function example(n = 128)
    A = MyLinearOp(n)

    x_square = rand(Complex128, n, n)
    y_square = similar(x_square)
    A_mul_B!(y_square, A, x_square)  # should work

    x_linear = rand(Complex128, n * n)
    y_linear = similar(x_linear)
    A_mul_B!(y_linear, A, x_linear)  # should work

    # then just call
    b = ...  # some vector
    x = gmres(A, b)
    # and transform x back to your matrix-like representation.
end
```

But it might complain that you have to implement one or two things, such as `eltype`, `size`, etc. A better way to work with matrix-free methods is to use https://github.com/Jutho/LinearMaps.jl
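For reference, a rough sketch of the LinearMaps.jl route in current Julia might look like the following (untested here; the operator, size, and package versions are assumptions for illustration):

```julia
using LinearMaps, FFTW, IterativeSolvers

n = 64

# Wrap the reshape -> fft -> flatten pipeline in a LinearMap, so the
# iterative solver only ever sees a length-n^2 vector. LinearMaps
# supplies eltype, size, and mul! for us.
F = LinearMap{ComplexF64}(x -> vec(fft(reshape(x, n, n), 2)), n^2)

b = rand(ComplexF64, n^2)
x = gmres(F, b)              # solve F * x = b matrix-free
X = reshape(x, n, n)         # recover the grid-shaped solution (a view, no copy)
```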
Thank you for your help!! With …

Basically, my question is: can we solve Ax = y without reshaping, with y being a 2d array?
Sorry, I think I got it now... |
Reshaping is just a view, so it's free.
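This is easy to check directly: for a plain `Array`, `reshape` returns an array that shares the same memory, so no data is copied.

```julia
x = zeros(4)
X = reshape(x, 2, 2)   # shares x's buffer; no copy is made
X[1, 1] = 42.0
x[1] == 42.0           # true: both names refer to the same data
```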
Indeed. There is a related example here at …
Late to the party, but here is some code for testing:
This yields
for the first benchmark, and
for the second. Doesn't seem to be so bad. For when …
Thank you!! Sorry for the late reply, but this does not seem to work.
I was wondering if you would think it's necessary to support arbitrarily sized inputs. For example, instead of allowing only matrix-vector products, allow matrix-matrix in the normal interpretation. This can come up in solving PDEs (see DifferentialEquations.jl). Currently the issue with the CG and GMRES functions is due to their use of the `dot` function, and I could change it up if people think this is the right way to go.
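One way the `dot` issue could be addressed is to write the iteration only in terms of a generic inner product that flattens whatever shape the iterate has. The following is a sketch only, not the package's implementation; the helper names `_inner` and `cg_generic` are hypothetical:

```julia
using LinearAlgebra

# Hypothetical generic inner product: treat any array iterate as a flat vector.
_inner(x, y) = dot(vec(x), vec(y))

# A textbook CG iteration written only in terms of _inner, broadcasting, and a
# user-supplied operator, so the iterate x and residual r may be matrices
# (e.g. a 2d PDE grid) instead of vectors.
function cg_generic(apply_A, b; tol = 1e-8, maxiter = 200)
    x = zero(b)
    r = copy(b)              # residual of the zero initial guess
    p = copy(r)
    rs = _inner(r, r)
    for _ in 1:maxiter
        Ap = apply_A(p)
        α = rs / _inner(p, Ap)
        x .+= α .* p
        r .-= α .* Ap
        rs_new = _inner(r, r)
        sqrt(real(rs_new)) < tol && break
        p .= r .+ (rs_new / rs) .* p
        rs = rs_new
    end
    x
end

# Example: a trivially SPD operator acting on matrix-shaped unknowns.
B = rand(8, 8)
X = cg_generic(Z -> 4 .* Z, B)   # solves 4X = B without ever flattening X; X ≈ B ./ 4
```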