*************************************************************************
* Installation Instructions for HDF5 using Autotools *
*************************************************************************
Table of Contents
Section I: Preconditions
Section II: Unix and macOS Configuration and Build
Section III: Full installation instructions for source distributions
Section IV: Using the Library
Section V: Windows Configuration and Build
************************************************************************
For help with installation, post questions to the HDF Forum or send them to the HDF Helpdesk:
HDF Forum: https://forum.hdfgroup.org/
HDF Helpdesk: https://help.hdfgroup.org/
========================================================================
I. Preconditions
========================================================================
Obtaining HDF5 source code
1. Create a directory for your development; for example, "myhdfstuff".
2. Obtain HDF5 source from Github
development branch: https://github.com/HDFGroup/hdf5
last release: https://github.com/HDFGroup/hdf5/releases/latest
hdf5-2_"X"_"Y".tar.gz or hdf5-2_"X"_"Y".zip
and put it in "myhdfstuff".
Uncompress the file. There should be a hdf5-2."X"."Y" folder.
========================================================================
II. Unix and macOS Configuration and Build
========================================================================
See RELEASE.txt in the release_notes/ directory for the list of platforms
tested for this release.
Before You Start:
1. Make sure that the zlib library is installed on your system.
2. Optional: Install the Szip version 2.1 library (you may use
Szip 2.0 binaries).
3. Extract the source from the hdf5-2.X.Y.tar file and change
directory to hdf5-2.X.Y.
4. Quick installation
For those who don't like to read ;-) the following steps can be used
to configure, build, test, and install the HDF5 library, header files,
and support programs. For example, to install HDF5 version 2.X.Y at
location /usr/local/hdf5, use the following steps.
$ cd hdf5-2.X.Y
$ ./configure --prefix=/usr/local/hdf5 <more configure_flags>
$ make
$ make check # run test suite.
$ make install
$ make check-install # verify installation.
<more configure_flags> above refers to the configure flags appropriate
to your installation. For example, to install HDF5 with the
Fortran and C++ interfaces and with Szip compression, the
configure line might read as follows:
$ ./configure --prefix=/usr/local/hdf5 --enable-fortran \
--enable-cxx --with-szlib=PATH_TO_SZIP
In this case, PATH_TO_SZIP would be replaced with the path to the
installed location of the Szip library.
========================================================================
III. Full installation instructions for source distributions
========================================================================
1. Unpacking the distribution
The HDF5 source code is distributed in a variety of formats which
can be unpacked with the following commands, each of which creates an
'hdf5-2.X.Y' directory, where 2.X.Y is the HDF5 version number.
1.1. Non-compressed tar archive (*.tar)
$ tar xf hdf5-2.X.Y.tar
1.2. Gzip'd tar archive (*.tar.gz)
$ gunzip < hdf5-2.X.Y.tar.gz | tar xf -
Or
$ tar zxf hdf5-2.X.Y.tar.gz
1.3. Bzip'd tar archive (*.tar.bz2)
$ bunzip2 < hdf5-2.X.Y.tar.bz2 | tar xf -
Or
$ tar jxf hdf5-2.X.Y.tar.bz2
2. Source versus build directories
On most systems the build can occur in a directory other than the
source directory, allowing multiple concurrent builds and/or
read-only source code. In order to accomplish this, one should
create a build directory, cd into that directory, and run the
`configure' script found in the source directory (configure
details are below). For example,
$ mkdir build-fortran
$ cd build-fortran
$ ../hdf5-2.X.Y/configure --enable-fortran ...
3. Configuring
HDF5 uses the GNU autoconf system for configuration, which
detects various features of the host system and creates the
Makefiles. On most systems it should be sufficient to say:
$ ./configure
Or
$ sh configure
The configuration process can be controlled through environment
variables, command-line switches, and host configuration files.
For a complete list of switches type:
$ ./configure --help
The host configuration files are located in the `config'
directory and are based on architecture name, vendor name, and/or
operating system which are displayed near the beginning of the
`configure' output. The host config file influences the behavior
of configure by setting or augmenting shell variables.
3.1. Specifying the installation directories
The default installation location is the HDF5 directory created in
the build directory. Typing `make install' will install the HDF5
library, header files, examples, and support programs in hdf5/lib,
hdf5/include, hdf5/doc/hdf5/examples, and hdf5/bin. To use a path
other than hdf5, specify the path with the `--prefix=PATH' switch:
$ ./configure --prefix=/usr/local
If shared libraries are being built (the default), the final
home of the shared library must be specified with this switch
before the library and executables are built.
HDF5 can be installed into a different location than the prefix
specified at configure time; see section 4.6, "Installing HDF5,"
for more details.
3.2. Using an alternate C compiler
By default, configure will look for the C compiler by trying
`gcc' and `cc'. However, if the environment variable "CC" is set
then its value is used as the C compiler. For instance, one would
use the following line to specify the native C compiler on a system
that also has the GNU gcc compiler (users of csh and derivatives
will need to prefix the commands below with `env'):
$ CC=cc ./configure
A parallel version of HDF5 can be built by specifying `mpicc'
as the C compiler. (The `--enable-parallel' flag documented
below is optional in this case.) Using the `mpicc' compiler
will ensure that the correct MPI and MPI-IO header files and
libraries are used.
$ CC=/usr/local/mpi/bin/mpicc ./configure
3.3. Additional compilation flags
If additional flags must be passed to the compilation commands,
specify those flags with the CFLAGS variable. For instance,
to enable symbolic debugging of a production version of HDF5, one
might say:
$ CFLAGS=-g ./configure --enable-build-mode=production
3.4. Compiling HDF5 wrapper libraries
One can optionally build the Fortran, C++, and Java interfaces to
the HDF5 C library. By default, these options are disabled. To build
them, specify '--enable-fortran', '--enable-cxx', or '--enable-java',
respectively.
$ ./configure --enable-fortran
$ ./configure --enable-cxx
$ ./configure --enable-java
Configuration will halt if a working Fortran 90 or 95 compiler or
C++ compiler is not found. Currently, the Fortran configure tests
for these compilers in order: f90, pgf90, f95. To use an
alternate compiler specify it with the FC variable:
$ FC=/usr/local/bin/g95 ./configure --enable-fortran
3.5. Specifying other programs
The build system has been tuned for use with GNU make but also
works with other versions of make. If the `make' command runs a
non-GNU version but a GNU version is available under a different
name (perhaps `gmake'), then HDF5 can be configured to use it by
setting the MAKE variable. Note that whatever value is used for
MAKE must also be used as the make command when building the
library:
$ MAKE=gmake ./configure
$ gmake
The `AR' and `RANLIB' variables can also be set to the names of
the `ar' and `ranlib' (or `:') commands to override values
detected by configure.
The HDF5 library, include files, and utilities are installed
during `make install' (described below) with a BSD-compatible
install program detected automatically by configure. If none is
found, the shell script bin/install-sh is used. Configure does not
check that the install script actually works; if a bad install is
detected on your system (e.g., on the ASCI blue machine as of
March 2, 1999) you have two choices:
1. Copy the bin/install-sh program to your $HOME/bin
directory, name it `install', and make sure that $HOME/bin
is searched before the system bin directories.
2. Specify the full path name of the `install-sh' program
as the value of the INSTALL environment variable. Note: do
not use `cp' or some other program in place of install
because the HDF5 makefiles also use the install program to
change file ownership and/or access permissions.
3.6. Specifying other libraries and headers
Configure searches the standard places (those places known by the
system's compiler) for header files and libraries. However,
additional directories can be specified by using the CPPFLAGS
and/or LDFLAGS variables:
$ CPPFLAGS=-I/home/robb/include \
LDFLAGS=-L/home/robb/lib \
./configure
HDF5 uses the zlib library to support the HDF5 deflate
data compression filter. Configure searches the standard places
(plus those specified above with the CPPFLAGS and LDFLAGS variables)
for the zlib headers and library. The search can be disabled by
specifying `--without-zlib' or alternate directories can be specified
with `--with-zlib=INCDIR,LIBDIR' or through the CPPFLAGS and LDFLAGS
variables:
$ ./configure --with-zlib=/usr/unsup/include,/usr/unsup/lib
$ CPPFLAGS=-I/usr/unsup/include \
LDFLAGS=-L/usr/unsup/lib \
./configure
HDF5 includes Szip as a predefined compression method (see INSTALL 2.2).
To enable Szip compression, the HDF5 library must be configured
and built using the Szip library:
$ ./configure --with-szlib=PATH_TO_SZIP
3.7. Static versus shared linking
The build process will create static libraries on all systems and
shared libraries on systems that support dynamic linking to a
sufficient degree. Either form of the library may be suppressed by
saying `--disable-static' or `--disable-shared'.
$ ./configure --disable-shared
Shared C++ and Fortran libraries will be built if shared libraries
are enabled.
To build only statically linked executables on platforms which
support shared libraries, use the `--enable-static-exec' flag.
$ ./configure --enable-static-exec
3.8. Optimization versus symbolic debugging
The library can be compiled to provide symbolic debugging support
so it can be debugged with gdb, dbx, ddd, etc., or it can be
compiled with various optimizations. To compile for symbolic
debugging (the default for snapshots), say
`--enable-build-mode=debug'; to compile with optimizations
(the default for supported public releases),
say `--enable-build-mode=production'. For a 'clean slate' configuration
with optimization disabled and nothing turned on,
say `--enable-build-mode=clean'. On some systems the
library can also be compiled for profiling with gprof by saying
`--enable-profiling'.
$ ./configure --enable-build-mode=debug #symbolic debugging
$ ./configure --enable-build-mode=production #optimized code
$ ./configure --enable-build-mode=clean #'clean slate'
$ ./configure --enable-profiling #for use with gprof
Regardless of whether support for symbolic debugging is enabled,
the library can also perform runtime debugging of certain packages
(such as type conversion execution times and extensive invariant
condition checking). To enable this debugging, supply a
comma-separated list of package names to the `--enable-internal-debug'
switch.
Debugging can be disabled by saying `--disable-internal-debug'.
The default debugging level for snapshots is a subset of the
available packages; the default for supported releases is no
debugging (debugging can incur a significant runtime penalty).
$ ./configure --enable-internal-debug=s,t #debug only H5S and H5T
$ ./configure --enable-internal-debug #debug normal packages
$ ./configure --enable-internal-debug=all #debug all packages
$ ./configure --disable-internal-debug #no debugging
HDF5 can also print a trace of all API function calls, their
arguments, and the return values. To enable or disable the
ability to trace the API say `--enable-trace' (the default for
snapshots) or `--disable-trace' (the default for public releases).
The tracing must also be enabled at runtime to see any output.
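For example:
$ ./configure --enable-trace      # build with API tracing support
$ ./configure --disable-trace     # build without API tracing support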
3.9. Parallel versus serial library
The HDF5 library can be configured to use MPI and MPI-IO for
parallelism on a distributed multi-processor system. Read the
file INSTALL_parallel for detailed information.
The threadsafe and multi-threaded concurrency options and the C++ and
Java interfaces are not compatible with the parallel option.
Unless --enable-unsupported has been specified on the configure line,
the following options must be disabled:
--enable-threadsafe, --enable-concurrency, --enable-cxx, --enable-java
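As a minimal sketch of a parallel configuration (the path to the mpicc
wrapper is an example and depends on your MPI installation):
$ CC=/usr/local/mpi/bin/mpicc ./configure --enable-parallel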
3.10. Threadsafe and multi-threaded concurrency capabilities
The HDF5 library can be configured to be thread-safe (on a very
large scale) with the `--enable-threadsafe' flag to the configure
script. Some platforms may also require the '--with-pthread=INC,LIB'
(or '--with-pthread=DIR') flag to the configure script.
For further information, see:
https://support.hdfgroup.org/releases/hdf5/documentation/gen_topics/Questions+about+thread-safety+and+concurrent+access
The high-level, C++, Fortran and Java interfaces are not compatible
with the thread-safety option because the lock is not hoisted
into the higher-level API calls.
Unless --enable-unsupported has been specified on the configure line,
the following options must be disabled:
--enable-hl, --enable-cxx, --enable-fortran, --enable-java
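For example, a sketch of a thread-safe build with the incompatible
high-level library disabled (add '--with-pthread=INC,LIB' if your
platform requires it):
$ ./configure --enable-threadsafe --disable-hl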
The multi-threaded concurrency option improves upon the basic
thread-safe capability by providing both threadsafety for API calls
and allowing multi-threaded concurrent execution within the library.
This may be enabled with the '--enable-concurrency' configure option,
which is mutually exclusive with the '--enable-threadsafe' option.
The current list of API routines that are enabled for concurrent
execution by multiple threads is: <none yet>
The high-level, C++, Fortran, and Java interfaces are not compatible
with the multi-threaded concurrency option because the lock is not
hoisted into the higher-level API calls.
Unless --enable-unsupported has been specified on the configure line,
the following options must be disabled:
--enable-hl, --enable-cxx, --enable-fortran, --enable-java
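Similarly, a sketch of a multi-threaded concurrency build with the
incompatible high-level library disabled:
$ ./configure --enable-concurrency --disable-hl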
3.11. Backward compatibility
The 2.0.0 version of the HDF5 library can be configured to operate
identically to the v1.14 library with the
--with-default-api-version=v114
configure flag, or identically to the v1.12 library with the
--with-default-api-version=v112
configure flag, or identically to the v1.10 library with the
--with-default-api-version=v110
configure flag, or identically to the v1.8 library with the
--with-default-api-version=v18
configure flag, or identically to the v1.6 library with the
--with-default-api-version=v16
configure flag. This allows existing code to be compiled with the
v2.0 library without requiring immediate changes to the application
source code. For additional configuration options and other details,
see "API Compatibility Macros":
https://hdfgroup.github.io/hdf5/develop/api-compat-macros.html
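For example, to build the v2.0 library so that it exposes the v1.14
APIs by default:
$ ./configure --with-default-api-version=v114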
4. Building
The library, confidence tests, and programs can be built by
specifying:
$ make
Note that if you have supplied some other make command via the MAKE
variable during the configuration step, that same command must be
used here.
When using GNU make, you can add `-j -l6' to the make command to
compile in parallel on SMP machines. Do not give a number after
the `-j' since GNU make will turn it off for recursive invocations
of make.
$ make -j -l6
4.1. Building doxygen
One can optionally build the doxygen files for the HDF5 C library.
By default, this option is disabled. To build the html files, specify
'--enable-doxygen'.
$ ./configure --enable-doxygen
Configuration will halt if the required applications are not available.
To build:
$ make doxygen
5. Testing
HDF5 comes with various test suites, all of which can be run by
specifying:
$ make check
To run only the tests for the library, change to the `test'
directory before issuing the command. Similarly, tests for the
parallel aspects of the library are in `testpar' and tests for
the support programs are in `tools'.
The `check' target consists of two sub-tests, check-s and check-p, which
are for serial and parallel tests, respectively. Since serial tests
and parallel tests must be run with single and multiple processes
respectively, the two sub-tests work nicely for batch systems in
which the number of processes is fixed per batch job. One may submit
one batch job, requesting 1 process, to run all the serial tests by
"make check-s"; and submit another batch job, requesting multiple
processes, to run all the parallel tests by "make check-p".
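For example:
$ make check-s      # serial tests only (single process)
$ make check-p      # parallel tests only (multiple processes)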
Temporary files will be deleted by each test when it completes,
but may continue to exist in an incomplete state if the test
fails. To prevent deletion of the files, define the HDF5_NOCLEANUP
environment variable.
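For example, a sketch assuming a Bourne-style shell (the value itself
is not significant; the variable only needs to be defined):
$ HDF5_NOCLEANUP=yes make check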
The HDF5 tests can take a long time to run on some systems. To perform
a faster (but less thorough) test, set the HDF5TestExpress environment
variable to 2 or 3 (with 3 being the shortest run). To perform a
longer test, set HDF5TestExpress to 0. 3 is the default.
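For example, assuming a Bourne-style shell:
$ HDF5TestExpress=0 make check    # longest, most thorough run
$ HDF5TestExpress=3 make check    # shortest run (the default)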
6. Installing HDF5
The HDF5 library, include files, and support programs can be
installed by specifying `make install'. The files are installed under the
directory specified with `--prefix=DIR' (or if not specified, in 'hdf5'
in the top directory of the HDF5 source code). They will be
placed in directories named `lib', `include', and `bin'. The directories,
if not existing, will be created automatically, provided the mkdir command
supports the -p option.
If `make install' fails because the install command at your site
somehow fails, you may use the install-sh that comes with the
source. You will need to run ./configure again.
$ INSTALL="$PWD/bin/install-sh -c" ./configure ...
$ make install
If you want to install HDF5 in a location other than the location
specified by the `--prefix=DIR' flag during configuration (or
instead of the default location, `hdf5'), you can do that
by running the deploy script:
$ bin/deploy NEW_DIR
This will install HDF5 in NEW_DIR. Alternately, you can do this
manually by issuing the command:
$ make install prefix=NEW_DIR
where NEW_DIR is the new directory where you wish to install HDF5.
If you do not use the deploy script, you should run h5redeploy in
the NEW_DIR/bin directory. This utility will fix the h5cc, h5fc, and
h5c++ scripts to reflect the new NEW_DIR location.
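For example:
$ cd NEW_DIR/bin
$ ./h5redeploy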
The library can be used without installing it by pointing the
compiler at the `src' and `src/.libs' directories for include files and
libraries (see the example following the list below). However, the
minimum that must be installed to make the library publicly
available is:
The library:
./src/.libs/libhdf5.a
The public header files:
./src/H5*public.h, ./src/H5public.h
./src/H5FD*.h except ./src/H5FDprivate.h,
./src/H5api_adpt.h
The main header file:
./src/hdf5.h
The configuration information:
./src/H5pubconf.h
The support programs that are useful are:
./tools/h5ls/h5ls (list file contents)
./tools/h5dump/h5dump (dump file contents)
./tools/misc/h5repart (repartition file families)
./tools/misc/h5debug (low-level file debugging)
./tools/h5import/h5import (imports data to HDF5 file)
./tools/h5diff/h5diff (compares two HDF5 files)
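As a sketch of compiling against the uninstalled library (the paths
are placeholders for your source and build directories, and the extra
link libraries, such as -lz for the deflate filter and -lm, depend on
how HDF5 was configured):
$ cc -I/path/to/build/src -I/path/to/hdf5-2.X.Y/src \
     myprog.c /path/to/build/src/.libs/libhdf5.a -lz -lm -o myprog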
========================================================================
IV. Using the Library
========================================================================
For information on using HDF5 see the documentation, tutorials and examples
found here:
https://support.hdfgroup.org/documentation/index.html
A summary of the features included in the built HDF5 installation can be found
in the libhdf5.settings file in the same directory as the static and/or
shared HDF5 libraries.
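As a minimal sketch of building an application against an installed
library (assuming HDF5 was installed under /usr/local/hdf5 as in the
quick-installation example, and that myprog.c is your application
source), the h5cc compiler wrapper installed in bin can be used to
compile and link a C program:
$ /usr/local/hdf5/bin/h5cc -o myprog myprog.c
$ ./myprog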
========================================================================
V. Windows Configuration and Build
========================================================================
See RELEASE.txt in the release_notes/ directory for the list of platforms
tested for this release.
We now recommend that users build, test, and install HDF5 using CMake.
Instructions for building and testing HDF5 using CMake can be found in the
INSTALL_CMake.txt file found in this folder.
For instructions on building and testing an application with HDF5, see the
USING_HDF5_CMake.txt file found in this folder.
Users who want to build and run an application with HDF5 in Visual Studio
without using CMake should consult the USING_HDF5_VS.txt file.