
[isambard-macs] Add Intel compilers #181

Draft · wants to merge 1 commit into base: main

Conversation

giordano
Member

@giordano giordano commented Jul 7, 2023

Haven't tested thoroughly yet

@giordano giordano requested a review from kaanolgu July 7, 2023 20:33
@kaanolgu
Contributor

kaanolgu commented Jul 7, 2023

I got the BabelStream SYCL2020 version working with Spack on the MACS partition now, and it's a bit complicated with the oneAPI ICPX compiler.
First, we need the following entry in the compilers.yaml file:

- compiler:
    spec: oneapi@=2021.4.0
    paths:
      cc: /lustre/software/x86/tools/oneapi-2021.4.0/compiler/2021.4.0/linux/bin/icx
      cxx: /lustre/software/x86/tools/oneapi-2021.4.0/compiler/2021.4.0/linux/bin/icpx
      f77: /lustre/software/x86/tools/oneapi-2021.4.0/compiler/2021.4.0/linux/bin/ifx
      fc: /lustre/software/x86/tools/oneapi-2021.4.0/compiler/2021.4.0/linux/bin/ifx
    flags: {}
    operating_system: rhel8
    target: any
    modules: []
    environment: {}
    extra_rpaths: []

Then the command

    spack install babelstream@develop%oneapi@2021.4.0 +sycl2020

failed with:

```
SYCLStream2020.h:15:10: fatal error: 'sycl/sycl.hpp' file not found
    #include <sycl/sycl.hpp>
             ^~~~~~~~~~~~~~~
1 error generated.
```

I found that sycl.hpp is at `/lustre/software/x86/tools/oneapi-2021.4.0/compiler/2021.4.0/linux/include/sycl/CL/sycl.hpp`.
So in my own fork of BabelStream I modified the SYCLStream2020.h file, replacing the line `#include <sycl/sycl.hpp>` with `#include <sycl/CL/sycl.hpp>`.
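A small search like the following can help work out what path a given toolchain actually uses for the header, i.e. what should go in the `#include <...>` directive. This is a hedged sketch: `find_sycl_header` and the sample directory layout are illustrative, not part of BabelStream or Spack.

```python
from pathlib import Path

def find_sycl_header(compiler_root: str) -> list[str]:
    """Return every sycl.hpp under the compiler's include tree, as paths
    relative to the include directory (which is what #include <...> sees)."""
    include_dir = Path(compiler_root) / "include"
    return sorted(str(p.relative_to(include_dir))
                  for p in include_dir.rglob("sycl.hpp"))

# Demo: mimic the oneAPI 2021.4 layout observed on MACS
import os, tempfile
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "include", "sycl", "CL"))
Path(root, "include", "sycl", "CL", "sycl.hpp").touch()
print(find_sycl_header(root))  # ['sycl/CL/sycl.hpp']
```

Running this against the real oneAPI 2021.4 install root would show `sycl/CL/sycl.hpp`, matching the include-path fix above.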

And it worked with the command:

    spack install babelstream@sycl2020-path-fix%oneapi@2021.4.0 +sycl2020

You could use my fork, but you would need to modify your own ~spack/var/spack/repos/builtin/packages/babelstream/package.py file to contain the following:

    # homepage = "https://github.com/kaanolgu/BabelStream"
    # url = "https://github.com/UoB-HPC/BabelStream/archive/refs/tags/v4.0.tar.gz"
    git = "https://github.com/kaanolgu/BabelStream.git"
    # version("4.0", sha256="a9cd39277fb15d977d468435eb9b894f79f468233f0131509aa540ffda4f5953")
    version("main", branch="main")
    version("develop", branch="develop")
    version("sycl2020-path-fix", branch="sycl2020-path-fix")

My question is: would it be possible to modify or fake this path so the header is available as `sycl/sycl.hpp`? The newer versions of the oneAPI compilers don't seem to have this path issue, but they are not present in the Isambard defaults, and `spack install` fails to build them on the Cray system, as I mentioned in spack/spack#37997.
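One way to "fake" the path without patching BabelStream itself would be a compatibility shim: a `sycl/sycl.hpp` that just forwards to the header that does exist, placed in a directory you then put on the include path (via `CPATH` or `-I`). A hedged sketch follows; the helper name and layout are hypothetical, and whether Spack's compiler wrappers pick the shim up cleanly on MACS is untested.

```python
from pathlib import Path

def make_sycl_shim(shim_dir: str, real_header: str = "sycl/CL/sycl.hpp") -> str:
    """Create shim_dir/sycl/sycl.hpp forwarding to the header that actually
    exists, so `#include <sycl/sycl.hpp>` resolves once shim_dir is on the
    include search path (e.g. CPATH=$shim_dir or -I$shim_dir)."""
    shim = Path(shim_dir) / "sycl" / "sycl.hpp"
    shim.parent.mkdir(parents=True, exist_ok=True)
    shim.write_text(f"#pragma once\n#include <{real_header}>\n")
    return str(shim)

import tempfile
d = tempfile.mkdtemp()
print(Path(make_sycl_shim(d)).read_text())
```

The upside over a fork is that the package source stays untouched; the downside is one more environment variable to keep consistent across builds.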

@kaanolgu
Contributor

kaanolgu commented Jul 7, 2023

I will test the compilers you added with the BabelStream SYCL model.

@giordano
Member Author

Out of curiosity, how did you find the oneAPI compiler on MACS? I get

```
$ module avail

----------------------------------------------------------------------------------- /opt/cray/pe/perftools/21.05.0/modulefiles -----------------------------------------------------------------------------------
perftools             perftools-lite        perftools-lite-events perftools-lite-gpu    perftools-lite-hbm    perftools-lite-loops  perftools-preload

-------------------------------------------------------------------------------------------- /opt/cray/pe/modulefiles --------------------------------------------------------------------------------------------
PrgEnv-cray/8.0.0(default)                                        cray-mvapich2/2.3.5(default)                                      craype/2.7.5
atp/3.11.6(default)                                               cray-mvapich2-gdr/2.3.5.mcast.cuda11.1.mofed5.0.gnu8.3.1(default) craype/2.7.7(default)
cce/11.0.2                                                        cray-mvapich2-gdr/2.3.5.mcast.cuda11.2.mofed5.1.gnu8.3.1          craype-dl-plugin-py3/mvapich/21.02.1.3
cce/11.0.4(default)                                               cray-mvapich2_gnu/2.3.5(default)                                  craype-dl-plugin-py3/openmpi/21.02.1.3
cdt/21.02                                                         cray-mvapich2_nogpu/2.3.5(default)                                craypkg-gen/1.3.13
cdt/21.05(default)                                                cray-mvapich2_nogpu_gnu/2.3.5(default)                            craypkg-gen/1.3.15(default)
cray-ccdb/4.10.4(default)                                         cray-mvapich2_noslurm/2.3.5(default)                              gdb4hpc/4.10.6(default)
cray-cti/2.11.6                                                   cray-mvapich2_noslurm_cuda111/2.3.5(default)                      papi/6.0.0.6
cray-cti/2.14.1(default)                                          cray-mvapich2_noslurm_cuda112/2.3.5(default)                      papi/6.0.0.7(default)
cray-fftw/3.3.8.7                                                 cray-mvapich2_noslurm_gnu/2.3.5(default)                          perftools-base/21.02.0
cray-fftw/3.3.8.10(default)                                       cray-mvapich2_noslurm_nogpu/2.3.5(default)                        perftools-base/21.05.0(default)
cray-impi/6                                                       cray-mvapich2_noslurm_nogpu_gnu/2.3.5(default)                    valgrind4hpc/2.10.3(default)
cray-libsci/20.03.1(default)                                      cray-stat/4.7.1
cray-libsci_acc/20.11.1(default)                                  cray-stat/4.11.1(default)

------------------------------------------------------------------------------------------------ /opt/modulefiles ------------------------------------------------------------------------------------------------
cudatoolkit/11.2   gcc/8.1.0(default)

--------------------------------------------------------------------------------------------- /cm/local/modulefiles ----------------------------------------------------------------------------------------------
boost/1.71.0                        cuda-dcgm/2.0.15.1                  ipmitool/1.8.18                     module-info                         pbspro/pbspro/19.2.8.20200925072630
cluster-tools/9.0                   dot                                 lua/5.3.5                           null                                python3
cmd                                 freeipmi/1.6.4                      luajit                              openldap                            python37
cmjob                               gcc/9.2.0                           module-git                          openmpi/mlnx/gcc/64/4.0.3rc4        shared

--------------------------------------------------------------------------------------------- /cm/shared/modulefiles ---------------------------------------------------------------------------------------------
blacs/openmpi/gcc/64/1.1patch03 cuda11.1/profiler/11.1.1        hdf5_18/1.8.21                  intel/ipp/64/2019/5.281         intel/tbb/32/2019/5.281         netcdf/gcc/64/gcc/64/4.7.3
blas/gcc/64/3.8.0               cuda11.1/toolkit/11.1.1         hwloc/1.11.11                   intel/ipp/64/2020/4.304         intel/tbb/32/2020/4.304         netperf/2.7.0
bonnie++/1.98                   cuda11.2/blas/11.2.0            intel/compiler/32/2019/19.0.5   intel/itac/2019/5.041           intel/tbb/64/2019/5.281         openblas/dynamic/0.3.7
cm-pmix3/3.1.4                  cuda11.2/fft/11.2.0             intel/compiler/32/2020/19.1.3   intel/itac/2020/3.036           intel/tbb/64/2020/4.304         openmpi/gcc/64/1.10.7
cuda10.2/blas/10.2.89           cuda11.2/toolkit/11.2.0         intel/compiler/64/2019/19.0.5   intel/mkl/64/2019/5.281         intel-tbb-oss/ia32/2020.3       pgi/64/19.10
cuda10.2/fft/10.2.89            default-environment             intel/compiler/64/2020/19.1.3   intel/mkl/64/2020/4.304         intel-tbb-oss/intel64/2020.3    ucx/1.6.1
cuda10.2/toolkit/10.2.89        fftw3/openmpi/gcc/64/3.3.8      intel/daal/64/2019/5.281        intel/mpi/32/2019/5.281         iozone/3_487
cuda11.1/blas/11.1.1            gdb/8.3.1                       intel/daal/64/2020/4.304        intel/mpi/32/2020/4.304         lapack/gcc/64/3.8.0
cuda11.1/fft/11.1.1             globalarrays/openmpi/gcc/64/5.7 intel/gdb/64/2019/4.281         intel/mpi/64/2019/5.281         mpich/ge/gcc/64/3.3.2
cuda11.1/nsight/11.1.1          hdf5/1.10.1                     intel/gdb/64/2020/0.304         intel/mpi/64/2020/4.304         mvapich2/gcc/64/2.3.2

-------------------------------------------------------------------------------- /opt/cray/pe/craype-targets/default/modulefiles ---------------------------------------------------------------------------------
craype-accel-amd-gfx906   craype-accel-nvidia35     craype-accel-nvidia70     craype-ivybridge          craype-network-opa        craype-x86-naples
craype-accel-amd-gfx908   craype-accel-nvidia52     craype-broadwell          craype-mic-knl            craype-sandybridge        craype-x86-rome
craype-accel-host         craype-accel-nvidia60     craype-haswell            craype-network-infiniband craype-x86-cascadelake    craype-x86-skylake
```

Also, https://gw4-isambard.github.io/docs/user-guide/software.html doesn't mention oneAPI at all 😅

@tomdeakin

The compilers are currently available in /projects/bristol/modules/intel-oneapi-2023.1.0

@giordano
Member Author

@kaanolgu would you be able to add the new compiler to the environment? I'm not sure I have much time these days, busy with conference preparation and attending 😞

@kaanolgu
Contributor

kaanolgu commented Jul 24, 2023 via email

@dcaseGH
Collaborator

dcaseGH commented Jul 24, 2023

It's a very minor point, Kaan, but I think you can install these things with Spack on Cray systems, as I did something similar on ARCHER. The issue you saw was, I think, that `cc --print-search-dirs` doesn't produce an `install:` line on Cray systems, but that line is only used when reporting an error. The build fails for some other reason (and on Isambard it looks like it also failed to report its failure).
Anyway, it's good to see that there are installations somewhere else that can be used. Thanks for all this work.
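For context, GCC-style `--print-search-dirs` output normally begins with an `install:` line, and that is the line error reporting looks for. A small illustration of the parsing involved (the sample outputs below are made up; only the `install:`/`programs:` format follows GCC's convention):

```python
def install_dir(search_dirs_output: str):
    """Return the path from the 'install:' line of --print-search-dirs
    output, or None when the line is absent (as on Cray compiler wrappers)."""
    for line in search_dirs_output.splitlines():
        if line.startswith("install:"):
            return line.split(":", 1)[1].strip()
    return None

gcc_like = ("install: /usr/lib/gcc/x86_64-linux-gnu/12/\n"
            "programs: =/usr/libexec/gcc\n")
cray_like = "programs: =/opt/cray/pe/craype/2.7.5/bin\n"

print(install_dir(gcc_like))   # /usr/lib/gcc/x86_64-linux-gnu/12/
print(install_dir(cray_like))  # None
```

With the Cray-style output the lookup yields nothing, which is consistent with the error-reporting path (not the build itself) tripping up.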
