Describe the bug

Creating a BF16 array fails on Julia 1.11.

To reproduce

The Minimal Working Example (MWE) for this bug:

```julia
using BFloat16s # 0.5.0
using CUDA     # 5.2.0

CUDA.zeros(BFloat16, 5)
```
which fails with
```
ERROR: MethodError: no method matching get_inference_world(::GPUCompiler.GPUInterpreter)
The function `get_inference_world` exists, but no method is defined for this combination of argument types.

Closest candidates are:
  get_inference_world(::REPL.REPLCompletions.REPLInterpreter)
   @ REPL ~/.julia/juliaup/julia-1.11.0-alpha2+0.x64.linux.gnu/share/julia/stdlib/v1.11/REPL/src/REPLCompletions.jl:550
  get_inference_world(::Core.Compiler.NativeInterpreter)
   @ Core compiler/types.jl:402
```
Expected behavior

Adding the same versions of BFloat16s and CUDA to a fresh Julia 1.10 environment, I don't get any errors. I would expect the same for Julia 1.11.
Version info
Details on Julia:

```
Julia Version 1.11.0-alpha2
Commit 9dfd28ab751 (2024-03-18 20:35 UTC)
Build Info:
  Official https://julialang.org/ release
Platform Info:
  OS: Linux (x86_64-linux-gnu)
  CPU: 16 × 12th Gen Intel(R) Core(TM) i5-12600K
  WORD_SIZE: 64
  LLVM: libLLVM-16.0.6 (ORCJIT, alderlake)
Threads: 1 default, 0 interactive, 1 GC (on 16 virtual cores)
Environment:
  LD_LIBRARY_PATH = /scratch/nvidia/hpc_sdk/Linux_x86_64/22.7/math_libs/lib64:/home/jschulze/.local/lib:
  JULIA_PROJECT = @.
```
Details on CUDA:

```
CUDA runtime 12.3, artifact installation
CUDA driver 12.3
NVIDIA driver 525.147.5, originally for CUDA 12.0

CUDA libraries:
- CUBLAS: 12.3.4
- CURAND: 10.3.4
- CUFFT: 11.0.12
- CUSOLVER: 11.5.4
- CUSPARSE: 12.2.0
- CUPTI: 21.0.0
- NVML: 12.0.0+525.147.5

Julia packages:
- CUDA: 5.2.0
- CUDA_Driver_jll: 0.7.0+1
- CUDA_Runtime_jll: 0.11.1+0

Toolchain:
- Julia: 1.11.0-alpha2
- LLVM: 16.0.6

1 device:
  0: NVIDIA RTX A4000 (sm_86, 15.718 GiB / 15.992 GiB available)
```
Additional context

Originally, I was trying to verify that I can use BF16 in hardware on the GPU. While digging through the documentation and history, I found that BFloat16s.jl originally provided only emulated (software) BF16. On Julia 1.10 some dummy code ran fine, but I am not used to reading the output of @code_native and friends, so I kept reading. I could only find a wrapped BF16 GEMM in CUDA.jl, but not where it is wired up to be used for CuArray{BFloat16}. Starting with Julia 1.11 and BFloat16s 0.5, more of the work of handling BF16 is delegated to LLVM, which I took to be a prerequisite for supporting it on the GPU. That led me to try Julia 1.11-alpha2 and to hit this bug. I hope this is the right place to report it; I have never used an alpha version of Julia before.
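For reference, the wrapper I found appears to be CUBLAS.gemmEx!; if I understand it correctly, calling it directly on BFloat16 device arrays (on Julia 1.10, where such arrays can still be created) would look roughly like the following sketch. This is untested on my side, and I am assuming gemmEx! accepts BF16 inputs here:

```julia
using BFloat16s, CUDA
using CUDA.CUBLAS

n = 128
# Build BF16 device arrays via Float32 conversion, to avoid relying on
# rand(BFloat16, ...) being defined.
A = CuArray(BFloat16.(rand(Float32, n, n)))
B = CuArray(BFloat16.(rand(Float32, n, n)))
C = CUDA.zeros(BFloat16, n, n)

# gemmEx! wraps cublasGemmEx; with alpha=1 and beta=0 this computes C = A*B.
# My assumption is that BF16 inputs take a hardware path on sm_80+ devices.
CUBLAS.gemmEx!('N', 'N', true, A, B, false, C)
```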
... if you read this far: how do I verify that my code is actually using BF16 in hardware on the GPU? My current best guess is to inspect the generated device code, as in the sketch below, but I am not sure I am reading it correctly. Thanks in advance for any pointers. 🙂
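A minimal sketch of that check, assuming CUDA.@device_code_ptx is the right tool and, again, running on Julia 1.10:

```julia
using BFloat16s, CUDA

a = CuArray(BFloat16.(rand(Float32, 16)))
b = CuArray(BFloat16.(rand(Float32, 16)))

# Dump the PTX for the broadcast kernel. Native BF16 usage should show up as
# bf16 instructions (e.g. cvt.rn.bf16.f32 or fma.rn.bf16), rather than plain
# f32 arithmetic surrounded by 16-bit loads and stores.
CUDA.@device_code_ptx a .+ b
```

(Whether a plain broadcast is even expected to emit native bf16 instructions is part of what I am unsure about.)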
This is unrelated to BFloat16; Julia 1.11 is just not supported yet.

You can try using #2241, but there are known issues. In any case, the specific problem here is that you need the latest GPUCompiler, which is incompatible with the latest released version of CUDA.jl.
Related: #2241, #2291