Defining inverses for triangular matrices with ArrayLayouts #320

Closed
DanielVandH opened this issue Jun 18, 2024 · 1 comment · Fixed by JuliaLinearAlgebra/ArrayLayouts.jl#236

Comments

@DanielVandH
Contributor

DanielVandH commented Jun 18, 2024

This issue is somewhat spread out across LazyArrays, InfiniteArrays, and BandedMatrices, so hopefully putting it all in one place here is fine. Would it make sense to define

Base.inv(A::LinearAlgebra.UpperOrLowerTriangular{<:Any, <:LazyArray}) = ArrayLayouts._inv(MemoryLayout(A), axes(A), A)

for triangular matrices defined from lazy arrays? This would fix some issues related to inverting infinite triangular matrices, since we could then define

ArrayLayouts._inv(::TriangularLayout{<:Any,<:Any,<:LazyArrays.LazyLayouts}, ::Union{NTuple{2, InfiniteArrays.OneToInf}, InfiniteArrays.InfAxes}, A) = ApplyArray(inv, A)

(inside InfiniteArrays.jl) without disrupting the existing methods for finite A. A similar overload for BandedMatrices.jl would be

Base.inv(A::LinearAlgebra.UpperOrLowerTriangular{<:Any,<:BandedMatrices.AbstractBandedMatrix}) = ArrayLayouts._inv(MemoryLayout(A), axes(A), A)

(apparently no extra _inv method is needed there, since the relevant ones are already defined in the BandedMatrices extension).
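Just to spell out the mechanism: the layout argument identifies the triangular-over-lazy/banded structure, while the axes argument distinguishes finite from infinite, so only the OneToInf case needs the new ApplyArray(inv, A) definition and the finite path is untouched. A rough sketch of what the two arguments look like (the exact printed layout types may differ slightly):

using LinearAlgebra, LazyArrays, ArrayLayouts, InfiniteArrays, BandedMatrices

A = UpperTriangular(ApplyArray(exp, [1.0 0.0; 0.0 1.0]))
MemoryLayout(A)  # a TriangularLayout wrapping the lazy ApplyLayout
axes(A)          # (Base.OneTo(2), Base.OneTo(2)) -- finite, so the existing _inv path applies

C = UpperTriangular(BandedMatrices._BandedMatrix((1:∞)', ∞, 0, 0))
MemoryLayout(C)  # a TriangularLayout wrapping the banded layout
axes(C)          # (OneToInf(), OneToInf()) -- infinite, so the new ApplyArray(inv, C) branch applies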

For context, I'm trying to make the following all work:

using SemiclassicalOrthogonalPolynomials, ArrayLayouts, ClassicalOrthogonalPolynomials, InfiniteLinearAlgebra, LinearAlgebra, BandedMatrices, LazyArrays, InfiniteArrays
A = UpperTriangular(ApplyArray(exp, [1.0 0.0; 0.0 1.0])) |> inv
C = UpperTriangular(BandedMatrices._BandedMatrix((1:∞)', ∞, 0, 0)) |> inv
B = (SemiclassicalJacobi(2.0, 2.0, 1.0, 1.0) \ SemiclassicalJacobi(2.0, 1.0, 0.0, 1.0)).args[1].args[2] |> inv # .args[1].args[2] gets the AdaptiveCholeskyFactor

and, with these changes, it all does:

julia> A = UpperTriangular(ApplyArray(exp, [1.0 0.0; 0.0 1.0]))
2×2 UpperTriangular{Float64, ApplyMatrix{Float64, typeof(exp), Tuple{Matrix{Float64}}}}:
 2.71828  0.0
  ⋅       2.71828

julia> inv(A) # same as on master - just making sure we don't change the working methods for finite arrays
2×2 Matrix{Float64}:
 0.367879  0.0
 0.0       0.367879

julia> C = UpperTriangular(BandedMatrices._BandedMatrix((1:∞)', ∞, 0, 0))
ℵ₀×ℵ₀ UpperTriangular{Int64, BandedMatrix{Int64, Adjoint{Int64, InfiniteArrays.InfUnitRange{Int64}}, InfiniteArrays.OneToInf{Int64}}} with indices OneToInf()×OneToInf():
 1   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅    ⋅    ⋅    ⋅    ⋅  …
 ⋅   2   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅    ⋅    ⋅    ⋅    ⋅
 ⋅   ⋅   3   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅    ⋅    ⋅    ⋅    ⋅
 ⋅   ⋅   ⋅   4   ⋅   ⋅   ⋅   ⋅   ⋅    ⋅    ⋅    ⋅    ⋅
 ⋅   ⋅   ⋅   ⋅   5   ⋅   ⋅   ⋅   ⋅    ⋅    ⋅    ⋅    ⋅
 ⋅   ⋅   ⋅   ⋅   ⋅   6   ⋅   ⋅   ⋅    ⋅    ⋅    ⋅    ⋅
 ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   7   ⋅   ⋅    ⋅    ⋅    ⋅    ⋅
 ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   8   ⋅    ⋅    ⋅    ⋅    ⋅
 ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   9    ⋅    ⋅    ⋅    ⋅
 ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   10    ⋅    ⋅    ⋅
 ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅    ⋅   11    ⋅    ⋅
 ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅    ⋅    ⋅   12    ⋅
 ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅   ⋅    ⋅    ⋅    ⋅   13  …
 ⋮                                                       ⋱

julia> inv(C) # broken on master
inv(ℵ₀×ℵ₀ UpperTriangular{Int64, BandedMatrix{Int64, Adjoint{Int64, InfiniteArrays.InfUnitRange{Int64}}, InfiniteArrays.OneToInf{Int64}}} with indices OneToInf()×OneToInf()) with indices OneToInf()×OneToInf():
 1.0  0.0  0.0       0.0   0.0  0.0       0.0       0.0    
 0.0  0.5  0.0       0.0   0.0  0.0       0.0       0.0
 0.0  0.0  0.333333  0.0   0.0  0.0       0.0       0.0
 0.0  0.0  0.0       0.25  0.0  0.0       0.0       0.0
 0.0  0.0  0.0       0.0   0.2  0.0       0.0       0.0
 0.0  0.0  0.0       0.0   0.0  0.166667  0.0       0.0    
 0.0  0.0  0.0       0.0   0.0  0.0       0.142857  0.0
 0.0  0.0  0.0       0.0   0.0  0.0       0.0       0.125
 0.0  0.0  0.0       0.0   0.0  0.0       0.0       0.0
 0.0  0.0  0.0       0.0   0.0  0.0       0.0       0.0
 0.0  0.0  0.0       0.0   0.0  0.0       0.0       0.0    
 0.0  0.0  0.0       0.0   0.0  0.0       0.0       0.0
 0.0  0.0  0.0       0.0   0.0  0.0       0.0       0.0
 ⋮                                                        ⋱

julia> B = (SemiclassicalJacobi(2.0, 2.0, 1.0, 1.0) \ SemiclassicalJacobi(2.0, 1.0, 0.0, 1.0)).args[1].args[2] 
ℵ₀×ℵ₀ UpperTriangular{Float64, InfiniteLinearAlgebra.AdaptiveCholeskyFactors{Float64, BandedMatrix{Float64, Matrix{Float64}, Base.OneTo{Int64}}, LazyBandedMatrices.SymTridiagonal{Float64, SubArray{Float64, 1, ClassicalOrthogonalPolynomials.CholeskyJacobiData{Float64}, Tuple{Base.Slice{InfiniteArrays.OneToInf{Int64}}, Int64}, false}, SubArray{Float64, 1, ClassicalOrthogonalPolynomials.CholeskyJacobiData{Float64}, Tuple{Base.Slice{InfiniteArrays.OneToInf{Int64}}, Int64}, false}}}} with indices OneToInf()×OneToInf():
 0.68313  0.323669   ⋅         ⋅         ⋅         ⋅        …
  ⋅       0.624188  0.382235   ⋅         ⋅         ⋅
  ⋅        ⋅        0.593394  0.410901   ⋅         ⋅
  ⋅        ⋅         ⋅        0.574777  0.428194   ⋅
  ⋅        ⋅         ⋅         ⋅        0.562341  0.439818
  ⋅        ⋅         ⋅         ⋅         ⋅        0.553451  …
 ⋮                                                          ⋱

julia> inv(B) # broken on master
inv(ℵ₀×ℵ₀ UpperTriangular{Float64, InfiniteLinearAlgebra.AdaptiveCholeskyFactors{Float64, BandedMatrix{Float64, Matrix{Float64}, Base.OneTo{Int64}}, LazyBandedMatrices.SymTridiagonal{Float64, SubArray{Float64, 1, ClassicalOrthogonalPolynomials.CholeskyJacobiData{Float64}, Tuple{Base.Slice{InfiniteArrays.OneToInf{Int64}}, Int64}, false}, SubArray{Float64, 1, ClassicalOrthogonalPolynomials.CholeskyJacobiData{Float64}, Tuple{Base.Slice{InfiniteArrays.OneToInf{Int64}}, Int64}, false}}}} with indices OneToInf()×OneToInf()) with indices OneToInf()×OneToInf():
 1.46385  -0.759072   0.488957  -0.34955    0.266164  -0.211516  
 0.0       1.60208   -1.03198    0.737752  -0.561761   0.446422
 0.0       0.0        1.68522   -1.20474    0.917352  -0.729004
 0.0       0.0        0.0        1.7398    -1.32477    1.05277
 0.0       0.0        0.0        0.0        1.77828   -1.41317
 0.0       0.0        0.0        0.0        0.0        1.80684   
 0.0       0.0        0.0        0.0        0.0        0.0
 0.0       0.0        0.0        0.0        0.0        0.0
 0.0       0.0        0.0        0.0        0.0        0.0
 0.0       0.0        0.0        0.0        0.0        0.0
 0.0       0.0        0.0        0.0        0.0        0.0       
 0.0       0.0        0.0        0.0        0.0        0.0
 0.0       0.0        0.0        0.0        0.0        0.0
 ⋮                                                              ⋱

I'm not sure this is the best way to implement these changes across the packages, but it seems to work. It's also possible I've missed an obvious existing solution that avoids writing ApplyArray(inv, ...) everywhere I need it; if so, I'd appreciate it being pointed out. I would do more extensive testing, but Julia is essentially unusable for me at the minute 🙃.

I think that, even with these changes, something like inv(B) * inv(C) (which I need) would still error: inv is broken on master, and ApplyArray(inv, B) * ApplyArray(inv, C) is broken as well. That is probably a separate issue, unless you can see something to add to this design that would fix it too.
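For concreteness, using B and C from above, the composition in question is

inv(B) * inv(C)                          # errors on master
ApplyArray(inv, B) * ApplyArray(inv, C)  # currently errors as well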

@dlfivefifty
Member

Let me look into this. Probably we want to overload inv for all LayoutMatrix types and their modified versions.
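Roughly something like the following sketch, though the exact signatures and which modified wrappers to cover would need checking:

Base.inv(A::LayoutMatrix) = ArrayLayouts._inv(MemoryLayout(A), axes(A), A)
Base.inv(A::LinearAlgebra.UpperOrLowerTriangular{<:Any,<:LayoutMatrix}) = ArrayLayouts._inv(MemoryLayout(A), axes(A), A)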
