Commit 8ce62ba5 authored by Adam J. Stewart, committed by scheibelp

Add documentation on build systems (#5015)

Spack provides a number of classes based on commonly-used build systems
that users can extend when writing packages; the classes provide functionality
to perform the actions relevant to the build system (e.g. running "configure" for
an Autotools-based package). This adds documentation for classes supporting the
following build systems:

* Makefile
* Autotools
* CMake
* QMake
* SCons
* Waf

This includes build systems for managing extensions of the following packages:

* Perl
* Python
* R
* Octave

This also adds documentation on implementing packages that use a custom build
system (e.g. Perl/CMake).

Spack also provides extendable classes which aggregate functionality for related
sets of packages, e.g. those using CUDA. Documentation is added for
CudaPackage.
.. _build-systems:
=============
Build Systems
=============
Spack defines a number of classes which understand how to use common
`build systems <https://en.wikipedia.org/wiki/List_of_build_automation_software>`_
(Makefiles, CMake, etc.). Spack package definitions can inherit these
classes in order to streamline their builds.
This guide provides information specific to each particular build system.
It assumes that you've read the :ref:`packaging-guide` and expands
on these ideas for each distinct build system that Spack supports:
.. toctree::
:maxdepth: 1
:caption: Make-based
build_systems/makefilepackage
.. toctree::
:maxdepth: 1
:caption: Make-incompatible
build_systems/sconspackage
build_systems/wafpackage
.. toctree::
:maxdepth: 1
:caption: Build-script generation
build_systems/autotoolspackage
build_systems/cmakepackage
build_systems/qmakepackage
.. toctree::
:maxdepth: 1
:caption: Language-specific
build_systems/octavepackage
build_systems/perlpackage
build_systems/pythonpackage
build_systems/rpackage
build_systems/rubypackage
.. toctree::
:maxdepth: 1
:caption: Other
build_systems/cudapackage
build_systems/intelpackage
build_systems/custompackage
For reference, the :py:mod:`Build System API docs <spack.build_systems>`
provide a list of build systems and methods/attributes that can be
overridden. If you are curious about the implementation of a particular
build system, you can view the source code by running:
.. code-block:: console
$ spack edit --build-system autotools
This will open up the ``AutotoolsPackage`` definition in your favorite
editor. In addition, if you are working with a less common build system
like QMake, SCons, or Waf, it may be useful to see examples of other
packages. You can quickly find examples by running:
.. code-block:: console
$ cd var/spack/repos/builtin/packages
$ grep -l QMakePackage */package.py
You can then view these packages with ``spack edit``.
This guide is intended to supplement the
:py:mod:`Build System API docs <spack.build_systems>` with examples of
how to override commonly used methods. It also provides rules of thumb
and suggestions for package developers who are unfamiliar with a
particular build system.
.. _autotoolspackage:
----------------
AutotoolsPackage
----------------
Autotools is a GNU build system that provides a build-script generator.
By running the platform-independent ``./configure`` script that comes
with the package, you can generate a platform-dependent Makefile.
^^^^^^
Phases
^^^^^^
The ``AutotoolsPackage`` base class comes with the following phases:
#. ``autoreconf`` - generate the configure script
#. ``configure`` - generate the Makefiles
#. ``build`` - build the package
#. ``install`` - install the package
Most of the time, the ``autoreconf`` phase will do nothing, but if the
package is missing a ``configure`` script, ``autoreconf`` will generate
one for you.
The other phases run:
.. code-block:: console
$ ./configure --prefix=/path/to/installation/prefix
$ make
$ make check # optional
$ make install
$ make installcheck # optional
Of course, you may need to add a few arguments to the ``./configure``
line.
^^^^^^^^^^^^^^^
Important files
^^^^^^^^^^^^^^^
The most important file for an Autotools-based package is the ``configure``
script. This script is automatically generated by Autotools and generates
the appropriate Makefile when run.
.. warning::
Watch out for fake Autotools packages!
Autotools is a very popular build system, and many people are used to the
classic steps to install a package:
.. code-block:: console
$ ./configure
$ make
$ make install
For this reason, some developers will write their own ``configure``
scripts that have nothing to do with Autotools. These packages may
not accept the same flags as other Autotools packages, so it is
better to use the ``Package`` base class and create a
:ref:`custom build system <custompackage>`. You can tell if a package
uses Autotools by running ``./configure --help`` and comparing the output
to other known Autotools packages. You should also look for files like:
* ``configure.ac``
* ``configure.in``
* ``Makefile.am``
Packages that don't use Autotools aren't likely to have these files.
^^^^^^^^^^^^^^^^^^^^^^^^^
Build system dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^
Whether or not your package requires Autotools to install depends on
how the source code is distributed. Most of the time, when developers
distribute tarballs, they will already contain the ``configure`` script
necessary for installation. If this is the case, your package does not
require any Autotools dependencies.
However, a basic rule of version control systems is to never commit
code that can be generated. The source code repository itself likely
does not have a ``configure`` script. Developers typically write
(or auto-generate) a ``configure.ac`` script that contains configuration
preferences and a ``Makefile.am`` script that contains build instructions.
Then, ``autoconf`` is used to convert ``configure.ac`` into ``configure``,
while ``automake`` is used to convert ``Makefile.am`` into ``Makefile.in``.
``Makefile.in`` is used by ``configure`` to generate a platform-dependent
``Makefile`` for you. The following diagram provides a high-level overview
of the process:
.. figure:: Autoconf-automake-process.*
:target: https://commons.wikimedia.org/w/index.php?curid=15581407
`GNU autoconf and automake process for generating makefiles <https://commons.wikimedia.org/wiki/File:Autoconf-automake-process.svg>`_
by `Jdthood` under `CC BY-SA 3.0 <https://creativecommons.org/licenses/by-sa/3.0/deed.en>`_
If a ``configure`` script is not present in your tarball, you will
need to generate one yourself. Luckily, Spack already has an ``autoreconf``
phase to do most of the work for you. By default, the ``autoreconf``
phase runs:
.. code-block:: console
$ libtoolize
$ aclocal
$ autoreconf --install --verbose --force
All you need to do is add a few Autotools dependencies to the package.
Most stable releases will come with a ``configure`` script, but if you
check out a commit from the ``develop`` branch, you would want to add:
.. code-block:: python
depends_on('autoconf', type='build', when='@develop')
depends_on('automake', type='build', when='@develop')
depends_on('libtool', type='build', when='@develop')
depends_on('m4', type='build', when='@develop')
In some cases, developers might need to distribute a patch that modifies
one of the files used to generate ``configure`` or ``Makefile.in``.
In this case, these scripts will need to be regenerated. It is
preferable to regenerate these manually using the patch, and then
create a new patch that directly modifies ``configure``. That way,
Spack can use the secondary patch and additional build system
dependencies aren't necessary.
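For example, a minimal sketch of this approach (the patch file name here is
hypothetical):

.. code-block:: python

   # Hypothetical patch that edits the generated ``configure`` script directly,
   # so autoconf/automake are not needed at build time.
   patch('configure-fix.patch')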
""""""""""""""""
force_autoreconf
""""""""""""""""
If for whatever reason you really want to add the original patch
and tell Spack to regenerate ``configure``, you can do so using the
following setting:
.. code-block:: python
force_autoreconf = True
This line tells Spack to wipe away the existing ``configure`` script
and generate a new one. If you only need to do this for a single
version, this can be done like so:
.. code-block:: python
@property
def force_autoreconf(self):
return self.version == Version('1.2.3')
^^^^^^^^^^^^^^^^^^^^^^^
Finding configure flags
^^^^^^^^^^^^^^^^^^^^^^^
Once you have a ``configure`` script present, the next step is to
determine what option flags are available. These flags can be found
by running:
.. code-block:: console
$ ./configure --help
``configure`` will display a list of valid flags separated into
some or all of the following sections:
* Configuration
* Installation directories
* Fine tuning of the installation directories
* Program names
* X features
* System types
* **Optional Features**
* **Optional Packages**
* **Some influential environment variables**
For the most part, you can ignore all but the last 3 sections.
The "Optional Features" sections lists flags that enable/disable
features you may be interested in. The "Optional Packages" section
often lists dependencies and the flags needed to locate them. The
"environment variables" section lists environment variables that the
build system uses to pass flags to the compiler and linker.
^^^^^^^^^^^^^^^^^^^^^^^^^^
Adding flags to configure
^^^^^^^^^^^^^^^^^^^^^^^^^^
For most of the flags you encounter, you will want a variant to
optionally enable/disable them. You can then optionally pass these
flags to the ``configure`` call by overriding the ``configure_args``
function like so:
.. code-block:: python
def configure_args(self):
args = []
if '+mpi' in self.spec:
args.append('--enable-mpi')
else:
args.append('--disable-mpi')
return args
Note that we are explicitly disabling MPI support if it is not
requested. This is important, as many Autotools packages will enable
options by default if the dependencies are found, and disable them
otherwise. We want Spack installations to be as deterministic as possible.
If two users install a package with the same variants, the goal is that
both installations work the same way. See `here <https://www.linux.com/news/best-practices-autotools>`__
and `here <https://wiki.gentoo.org/wiki/Project:Quality_Assurance/Automagic_dependencies>`__
for a rationale as to why these so-called "automagic" dependencies
are a problem.
By default, Autotools installs packages to ``/usr/local``. We don't want this,
so Spack automatically adds ``--prefix=/path/to/installation/prefix``
to your list of ``configure_args``. You don't need to add this yourself.
^^^^^^^^^^^^^^^^
Helper functions
^^^^^^^^^^^^^^^^
You may have noticed that most of the Autotools flags are of the form
``--enable-foo``, ``--disable-bar``, ``--with-baz=<prefix>``, or
``--without-baz``. Since these flags are so common, Spack provides a
couple of helper functions to make your life easier.
TODO: document ``with_or_without`` and ``enable_or_disable``.
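Pending full documentation, here is a rough sketch of how these helpers are
typically used (this assumes the package defines boolean ``mpi`` and
``shared`` variants):

.. code-block:: python

   def configure_args(self):
       args = []
       # '--with-mpi' if '+mpi' is in the spec, '--without-mpi' otherwise
       args.extend(self.with_or_without('mpi'))
       # '--enable-shared' or '--disable-shared', based on the 'shared' variant
       args.extend(self.enable_or_disable('shared'))
       return args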
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Configure script in a sub-directory
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Occasionally, developers will hide their source code and ``configure``
script in a subdirectory like ``src``. If this happens, Spack won't
be able to automatically detect the build system properly when running
``spack create``. You will have to manually change the package base
class and tell Spack where the ``configure`` script resides. You can
do this like so:
.. code-block:: python
configure_directory = 'src'
^^^^^^^^^^^^^^^^^^^^^^
Building out of source
^^^^^^^^^^^^^^^^^^^^^^
Some packages like ``gcc`` recommend building their software in a
different directory than the source code to prevent build pollution.
This can be done using the ``build_directory`` variable:
.. code-block:: python
build_directory = 'spack-build'
By default, Spack will build the package in the same directory that
contains the ``configure`` script.
^^^^^^^^^^^^^^^^^^^^^^^^^
Build and install targets
^^^^^^^^^^^^^^^^^^^^^^^^^
For most Autotools packages, the usual:
.. code-block:: console
$ configure
$ make
$ make install
is sufficient to install the package. However, if you need to run
make with any other targets, for example, to build an optional
library or build the documentation, you can add these like so:
.. code-block:: python
build_targets = ['all', 'docs']
install_targets = ['install', 'docs']
^^^^^^^
Testing
^^^^^^^
Autotools-based packages typically provide unit testing via the
``check`` and ``installcheck`` targets. If you build your software
with ``spack install --test=root``, Spack will check for the presence
of a ``check`` or ``test`` target in the Makefile and run
``make check`` for you. After installation, it will check for an
``installcheck`` target and run ``make installcheck`` if it finds one.
^^^^^^^^^^^^^^^^^^^^^^
External documentation
^^^^^^^^^^^^^^^^^^^^^^
For more information on the Autotools build system, see:
https://www.gnu.org/software/automake/manual/html_node/Autotools-Introduction.html
.. _cmakepackage:
------------
CMakePackage
------------
Like Autotools, CMake is a widely-used build-script generator. Designed
by Kitware, CMake is the most popular build system for new C, C++, and
Fortran projects, and many older projects are switching to it as well.
Unlike Autotools, CMake can generate build scripts for builders other
than Make: Ninja, Visual Studio, etc. It is therefore cross-platform,
whereas Autotools is Unix-only.
^^^^^^
Phases
^^^^^^
The ``CMakePackage`` base class comes with the following phases:
#. ``cmake`` - generate the Makefile
#. ``build`` - build the package
#. ``install`` - install the package
By default, these phases run:
.. code-block:: console
$ mkdir spack-build
$ cd spack-build
$ cmake .. -DCMAKE_INSTALL_PREFIX=/path/to/installation/prefix
$ make
$ make test # optional
$ make install
A few more flags are passed to ``cmake`` by default, including flags
for setting the build type and flags for locating dependencies. Of
course, you may need to add a few arguments yourself.
^^^^^^^^^^^^^^^
Important files
^^^^^^^^^^^^^^^
A CMake-based package can be identified by the presence of a
``CMakeLists.txt`` file. This file defines the build flags that can be
passed to the cmake invocation, as well as linking instructions. If
you are familiar with CMake, it can prove very useful for determining
dependencies and dependency version requirements.
One thing to look for is the ``cmake_minimum_required`` function:
.. code-block:: cmake
cmake_minimum_required(VERSION 2.8.12)
This means that CMake 2.8.12 is the earliest release that will work.
You should specify this in a ``depends_on`` statement.
CMake-based packages may also contain ``CMakeLists.txt`` in subdirectories.
This modularization helps to manage complex builds in a hierarchical
fashion. Sometimes these nested ``CMakeLists.txt`` require additional
dependencies not mentioned in the top-level file.
There's also usually a ``cmake`` or ``CMake`` directory containing
additional macros, find scripts, etc. These may prove useful in
determining dependency version requirements.
^^^^^^^^^^^^^^^^^^^^^^^^^
Build system dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^
Every package that uses the CMake build system requires a ``cmake``
dependency. Since this is always the case, the ``CMakePackage`` base
class already contains:
.. code-block:: python
depends_on('cmake', type='build')
If you need to specify a particular version requirement, you can
override this in your package:
.. code-block:: python
depends_on('cmake@2.8.12:', type='build')
^^^^^^^^^^^^^^^^^^^
Finding cmake flags
^^^^^^^^^^^^^^^^^^^
To get a list of valid flags that can be passed to ``cmake``, run the
following command in the directory that contains ``CMakeLists.txt``:
.. code-block:: console
$ cmake . -LAH
CMake will start by checking for compilers and dependencies. Eventually
it will begin to list build options. You'll notice that most of the
build options at the top are prefixed with ``CMAKE_``. You can safely
ignore most of these options as Spack already sets them for you. This
includes flags needed to locate dependencies, RPATH libraries, set the
installation directory, and set the build type.
The rest of the flags are the ones you should consider adding to your
package. They often include flags to enable/disable support for certain
features and locate specific dependencies. One thing you'll notice that
makes CMake different from Autotools is that CMake has an understanding
of build flag hierarchy. That is, certain flags will not display unless
their parent flag has been selected. For example, flags to specify the
``lib`` and ``include`` directories for a package might not appear
unless CMake found the dependency it was looking for. You may need to
manually specify certain flags to explore the full depth of supported
build flags, or check the ``CMakeLists.txt`` yourself.
^^^^^^^^^^^^^^^^^^^^^
Adding flags to cmake
^^^^^^^^^^^^^^^^^^^^^
To add additional flags to the ``cmake`` call, simply override the
``cmake_args`` function:
.. code-block:: python
def cmake_args(self):
args = []
if '+hdf5' in self.spec:
args.append('-DDETECT_HDF5=ON')
else:
args.append('-DDETECT_HDF5=OFF')
return args
^^^^^^^^^^
Generators
^^^^^^^^^^
CMake and Autotools are build-script generation tools; they "generate"
the Makefiles that are used to build a software package. CMake actually
supports multiple generators, not just Makefiles. Another common
generator is Ninja. To switch to the Ninja generator, simply add:
.. code-block:: python
generator = 'Ninja'
``CMakePackage`` defaults to "Unix Makefiles". If you switch to the
Ninja generator, make sure to add:
.. code-block:: python
depends_on('ninja', type='build')
to the package as well. Aside from that, you shouldn't need to do
anything else. Spack will automatically detect that you are using
Ninja and run:
.. code-block:: console
$ cmake .. -G Ninja
$ ninja
$ ninja install
Spack currently only supports "Unix Makefiles" and "Ninja" as valid
generators, but it should be simple to add support for alternative
generators. For more information on CMake generators, see:
https://cmake.org/cmake/help/latest/manual/cmake-generators.7.html
^^^^^^^^^^^^^^^^
CMAKE_BUILD_TYPE
^^^^^^^^^^^^^^^^
Every CMake-based package accepts a ``-DCMAKE_BUILD_TYPE`` flag to
dictate which level of optimization to use. In order to ensure
uniformity across packages, the ``CMakePackage`` base class adds
a variant to control this:
.. code-block:: python
variant('build_type', default='RelWithDebInfo',
description='CMake build type',
values=('Debug', 'Release', 'RelWithDebInfo', 'MinSizeRel'))
However, not every CMake package accepts all four of these options.
Grep the ``CMakeLists.txt`` file to see if the default values are
missing or replaced. For example, the
`dealii <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/dealii/package.py>`_
package overrides the default variant with:
.. code-block:: python
variant('build_type', default='DebugRelease',
description='The build type to build',
values=('Debug', 'Release', 'DebugRelease'))
For more information on ``CMAKE_BUILD_TYPE``, see:
https://cmake.org/cmake/help/latest/variable/CMAKE_BUILD_TYPE.html
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
CMakeLists.txt in a sub-directory
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Occasionally, developers will hide their source code and ``CMakeLists.txt``
in a subdirectory like ``src``. If this happens, Spack won't
be able to automatically detect the build system properly when running
``spack create``. You will have to manually change the package base
class and tell Spack where ``CMakeLists.txt`` resides. You can do this
like so:
.. code-block:: python
root_cmakelists_dir = 'src'
Note that this path is relative to the root of the extracted tarball,
not to the ``build_directory``. It defaults to the current directory.
^^^^^^^^^^^^^^^^^^^^^^
Building out of source
^^^^^^^^^^^^^^^^^^^^^^
By default, Spack builds every ``CMakePackage`` in a ``spack-build``
sub-directory. If, for whatever reason, you would like to build in a
different sub-directory, simply override ``build_directory`` like so:
.. code-block:: python
build_directory = 'my-build'
^^^^^^^^^^^^^^^^^^^^^^^^^
Build and install targets
^^^^^^^^^^^^^^^^^^^^^^^^^
For most CMake packages, the usual:
.. code-block:: console
$ cmake
$ make
$ make install
is sufficient to install the package. However, if you need to run
make with any other targets, for example, to build an optional
library or build the documentation, you can add these like so:
.. code-block:: python
build_targets = ['all', 'docs']
install_targets = ['install', 'docs']
^^^^^^^
Testing
^^^^^^^
CMake-based packages typically provide unit testing via the
``test`` target. If you build your software with ``--test=root``,
Spack will check for the presence of a ``test`` target in the
Makefile and run ``make test`` for you. If you want to run a
different test instead, simply override the ``check`` method.
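For instance, a hedged sketch of such an override for a project whose tests
live behind a differently named target (``check-all`` here is hypothetical):

.. code-block:: python

   def check(self):
       with working_dir(self.build_directory):
           # 'check-all' is a made-up target; use whatever the project defines
           make('check-all')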
^^^^^^^^^^^^^^^^^^^^^^
External documentation
^^^^^^^^^^^^^^^^^^^^^^
For more information on the CMake build system, see:
https://cmake.org/cmake/help/latest/
.. _cudapackage:
-----------
CudaPackage
-----------
Unlike the other classes described in this guide, ``CudaPackage`` does not
represent a build system. Instead, its goal is to simplify and unify the use
of ``CUDA`` in other packages.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Provided variants and dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
``CudaPackage`` provides a ``cuda`` variant (disabled by default) to enable/disable
``CUDA``, and a ``cuda_arch`` variant to optionally specify the architecture.
It also declares dependencies on the ``CUDA`` package (``depends_on('cuda@...')``)
based on the architecture, and specifies conflicts for certain compiler versions.
^^^^^
Usage
^^^^^
In order to use it, just add another base class to your package, for example:
.. code-block:: python
class MyPackage(CMakePackage, CudaPackage):
    ...
    def cmake_args(self):
        spec = self.spec
        options = []
        if '+cuda' in spec:
            options.append('-DWITH_CUDA=ON')
            cuda_arch = spec.variants['cuda_arch'].value
            if cuda_arch is not None:
                options.append('-DCUDA_FLAGS=-arch=sm_{0}'.format(cuda_arch[0]))
        else:
            options.append('-DWITH_CUDA=OFF')
        return options
.. _custompackage:
--------------------
Custom Build Systems
--------------------
While the build systems listed above should meet your needs for the
vast majority of packages, some packages provide custom build scripts.
This guide is intended for the following use cases:
* Packaging software with its own custom build system
* Adding support for new build systems
If you want to add support for a new build system, a good place to
start is to look at the definitions of other build systems. This guide
focuses mostly on how Spack's build systems work.
In this guide, we will be using the
`perl <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/perl/package.py>`_ and
`cmake <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/cmake/package.py>`_
packages as examples. ``perl``'s build system is a hand-written
``Configure`` shell script, while ``cmake`` bootstraps itself during
installation. Both of these packages require custom build systems.
^^^^^^^^^^
Base class
^^^^^^^^^^
If your package does not belong to any of the aforementioned build
systems that Spack already supports, you should inherit from the
``Package`` base class. ``Package`` is a simple base class with a
single phase: ``install``. If your package is simple, you may be able
to simply write an ``install`` method that gets the job done. However,
if your package is more complex and installation involves multiple
steps, you should add separate phases as mentioned in the next section.
If you are creating a new build system base class, you should inherit
from ``PackageBase``. This is the superclass for all build systems in
Spack.
^^^^^^
Phases
^^^^^^
The most important concept in Spack's build system support is the idea
of phases. Each build system defines a set of phases that are necessary
to install the package. They usually follow some sort of "configure",
"build", "install" guideline, but any of those phases may be missing
or combined with another phase.
If you look at the ``perl`` package, you'll see:
.. code-block:: python
phases = ['configure', 'build', 'install']
Similarly, ``cmake`` defines:
.. code-block:: python
phases = ['bootstrap', 'build', 'install']
If we look at the ``cmake`` example, this tells Spack's ``PackageBase``
class to run the ``bootstrap``, ``build``, and ``install`` functions
in that order. It is now up to you to define these methods.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Phase and phase_args functions
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
If we look at ``perl``, we see that it defines a ``configure`` method:
.. code-block:: python
def configure(self, spec, prefix):
configure = Executable('./Configure')
configure(*self.configure_args())
There is also a corresponding ``configure_args`` function that handles
all of the arguments to pass to ``Configure``, just like in
``AutotoolsPackage``. Comparatively, the ``build`` and ``install``
phases are pretty simple:
.. code-block:: python
def build(self, spec, prefix):
make()
def install(self, spec, prefix):
make('install')
The ``cmake`` package looks very similar, but with a ``bootstrap``
function instead of ``configure``:
.. code-block:: python
def bootstrap(self, spec, prefix):
bootstrap = Executable('./bootstrap')
bootstrap(*self.bootstrap_args())
def build(self, spec, prefix):
make()
def install(self, spec, prefix):
make('install')
Again, there is a ``bootstrap_args`` function that determines the
correct bootstrap flags to use.
^^^^^^^^^^^^^^^^^^^^
run_before/run_after
^^^^^^^^^^^^^^^^^^^^
Occasionally, you may want to run extra steps either before or after
a given phase. This applies not just to custom build systems, but to
existing build systems as well. You may need to patch a file that is
generated by configure, or install extra files in addition to what
``make install`` copies to the installation prefix. This is where
``@run_before`` and ``@run_after`` come in.
These Python decorators allow you to write functions that are called
before or after a particular phase. For example, in ``perl``, we see:
.. code-block:: python
@run_after('install')
def install_cpanm(self):
spec = self.spec
if '+cpanm' in spec:
with working_dir(join_path('cpanm', 'cpanm')):
perl = spec['perl'].command
perl('Makefile.PL')
make()
make('install')
This extra step automatically installs ``cpanm`` in addition to the
base Perl installation.
^^^^^^^^^^^^^^^^^^^^^
on_package_attributes
^^^^^^^^^^^^^^^^^^^^^
The ``run_before``/``run_after`` logic discussed above becomes
particularly powerful when combined with the ``@on_package_attributes``
decorator. This decorator allows you to conditionally run certain
functions depending on the attributes of that package. The most
common example is conditional testing. Many unit tests are prone to
failure, even when there is nothing wrong with the installation.
Unfortunately, non-portable unit tests and tests that are
"supposed to fail" are more common than we would like. Instead of
always running unit tests on installation, Spack lets users
conditionally run tests with the ``--test=root`` flag.
If we wanted to define a function that would conditionally run
if and only if this flag is set, we would use the following line:
.. code-block:: python
@on_package_attributes(run_tests=True)
^^^^^^^
Testing
^^^^^^^
Let's put everything together and add unit tests to our package.
In the ``perl`` package, we can see:
.. code-block:: python
@run_after('build')
@on_package_attributes(run_tests=True)
def test(self):
make('test')
As you can guess, this runs ``make test`` *after* building the package,
if and only if testing is requested. Again, this is not specific to
custom build systems, it can be added to existing build systems as well.
Ideally, every package in Spack will have some sort of test to ensure
that it was built correctly. It is up to the package authors to make
sure this happens. If you are adding a package for some software and
the developers list commands to test the installation, please add these
tests to your ``package.py``.
.. warning::
The order of decorators matters. The following ordering:
.. code-block:: python
@run_after('install')
@on_package_attributes(run_tests=True)
works as expected. However, if you reverse the ordering:
.. code-block:: python
@on_package_attributes(run_tests=True)
@run_after('install')
the tests will always be run regardless of whether or not
``--test=root`` is requested. See https://github.com/spack/spack/issues/3833
for more information.
.. _intelpackage:
------------
IntelPackage
------------
Intel provides many licensed software packages, which all share the
same basic steps for configuring and installing, as well as license
management.
This build system is a work-in-progress. See
https://github.com/spack/spack/pull/4300 and
https://github.com/spack/spack/pull/7469 for more information.
.. _makefilepackage:
---------------
MakefilePackage
---------------
The most primitive build system a package can use is a plain Makefile.
Makefiles are simple to write for small projects, but they usually
require you to edit the Makefile to set platform and compiler-specific
variables.
^^^^^^
Phases
^^^^^^
The ``MakefilePackage`` base class comes with 3 phases:
#. ``edit`` - edit the Makefile
#. ``build`` - build the project
#. ``install`` - install the project
By default, ``edit`` does nothing, but you can override it to replace
hard-coded Makefile variables. The ``build`` and ``install`` phases
run:
.. code-block:: console
$ make
$ make install
^^^^^^^^^^^^^^^
Important files
^^^^^^^^^^^^^^^
The main file that matters for a ``MakefilePackage`` is the Makefile.
This file will be named one of the following ways:
* ``GNUmakefile`` (only works with GNU Make)
* ``Makefile`` (most common)
* ``makefile``
Some Makefiles also *include* other configuration files. Check for an
``include`` directive in the Makefile.
^^^^^^^^^^^^^^^^^^^^^^^^^
Build system dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^
Spack assumes that the operating system will have a valid ``make`` utility
installed already, so you don't need to add a dependency on ``make``.
However, if the package uses a ``GNUmakefile`` or the developers recommend
using GNU Make, you should add a dependency on ``gmake``:
.. code-block:: python
depends_on('gmake', type='build')
^^^^^^^^^^^^^^^^^^^^^^^^^^
Types of Makefile packages
^^^^^^^^^^^^^^^^^^^^^^^^^^
Most of the work involved in packaging software that uses Makefiles
involves overriding or replacing hard-coded variables. Many packages
make the mistake of hard-coding compilers, usually for GCC or Intel.
This is fine if you happen to be using that particular compiler, but
Spack is designed to work with *any* compiler, and you need to ensure
that this is the case.
Depending on how the Makefile is designed, there are 4 common strategies
that can be used to set or override the appropriate variables:
"""""""""""""""""""""
Environment variables
"""""""""""""""""""""
Make has multiple types of
`assignment operators <https://www.gnu.org/software/make/manual/make.html#Setting>`_.
Some Makefiles use ``=`` to assign variables. The only way to override
these variables is to edit the Makefile or override them on the
command-line. However, Makefiles that use ``?=`` for assignment honor
environment variables. Since Spack already sets ``CC``, ``CXX``, ``F77``,
and ``FC``, you won't need to worry about setting these variables. If
there are any other variables you need to set, you can do this in the
``edit`` method:
.. code-block:: python
def edit(self, spec, prefix):
env['PREFIX'] = prefix
env['BLASLIB'] = spec['blas'].libs.ld_flags
`cbench <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/cbench/package.py>`_
is a good example of a simple package that does this, while
`esmf <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/esmf/package.py>`_
is a good example of a more complex package.
""""""""""""""""""""""
Command-line arguments
""""""""""""""""""""""
If the Makefile ignores environment variables, the next thing to try
is command-line arguments. You can do this by overriding the
``build_targets`` attribute. If you don't need access to the spec,
you can do this like so:
.. code-block:: python
build_targets = ['CC=cc']
If you do need access to the spec, you can create a property like so:
.. code-block:: python
@property
def build_targets(self):
spec = self.spec
return [
'CC=cc',
'BLASLIB={0}'.format(spec['blas'].libs.ld_flags),
]
`cloverleaf <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/cloverleaf/package.py>`_
is a good example of a package that uses this strategy.
"""""""""""""
Edit Makefile
"""""""""""""
Some Makefiles are just plain stubborn and will ignore command-line
variables. The only way to ensure that these packages build correctly
is to directly edit the Makefile. Spack provides a ``FileFilter`` class
and a ``filter_file`` method to help with this. For example:
.. code-block:: python
def edit(self, spec, prefix):
makefile = FileFilter('Makefile')
makefile.filter('CC = gcc', 'CC = cc')
makefile.filter('CXX = g++', 'CXX = c++')
`stream <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/stream/package.py>`_
is a good example of a package that involves editing a Makefile to set
the appropriate variables.
"""""""""""
Config file
"""""""""""
More complex packages often involve Makefiles that *include* a
configuration file. These configuration files are primarily composed
of variables relating to the compiler, platform, and the location of
dependencies or names of libraries. Since these config files are
dependent on the compiler and platform, you will often see entire
directories of examples for common compilers and architectures. Use
these examples to help determine what possible values to use.
If the config file is long and only contains one or two variables
that need to be modified, you can use the technique above to edit
the config file. However, if you end up needing to modify most of
the variables, it may be easier to write a new file from scratch.
If the variables are independent of one another, a dictionary works
well for storing them:
.. code-block:: python
def edit(self, spec, prefix):
config = {
'CC': 'cc',
'MAKE': 'make',
}
if '+blas' in spec:
config['BLAS_LIBS'] = spec['blas'].libs.joined()
with open('make.inc', 'w') as inc:
for key in config:
inc.write('{0} = {1}\n'.format(key, config[key]))
`elk <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/elk/package.py>`_
is a good example of a package that uses a dictionary to store
configuration variables.
If the order of variables is important, it may be easier to store
them in a list:
.. code-block:: python
def edit(self, spec, prefix):
config = [
'INSTALL_DIR = {0}'.format(prefix),
'INCLUDE_DIR = $(INSTALL_DIR)/include',
'LIBRARY_DIR = $(INSTALL_DIR)/lib',
]
with open('make.inc', 'w') as inc:
for var in config:
inc.write('{0}\n'.format(var))
`hpl <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/hpl/package.py>`_
is a good example of a package that uses a list to store
configuration variables.
^^^^^^^^^^^^^^^^^^^^^^^^^^
Variables to watch out for
^^^^^^^^^^^^^^^^^^^^^^^^^^
The following is a list of common variables to watch out for. The first
two sections are
`implicit variables <https://www.gnu.org/software/make/manual/html_node/Implicit-Variables.html>`_
defined by Make and will always use the same name, while the rest are
user-defined variables and may vary from package to package.
* **Compilers**
This includes variables such as ``CC``, ``CXX``, ``F77``, ``F90``,
and ``FC``, as well as variables related to MPI compiler wrappers,
like ``MPICC`` and friends.
* **Compiler flags**
This includes variables for specific compilers, like ``CFLAGS``,
``CXXFLAGS``, ``F77FLAGS``, ``F90FLAGS``, ``FCFLAGS``, and ``CPPFLAGS``.
These variables are often hard-coded to contain flags specific to a
certain compiler. If these flags don't work for every compiler,
you may want to consider filtering them.
* **Variables that enable or disable features**
This includes variables like ``MPI``, ``OPENMP``, ``PIC``, and
``DEBUG``. These flags often require you to create a variant
so that you can either build with or without MPI support, for
example. These flags are often compiler-dependent. You should
replace them with the appropriate compiler flags, such as
``self.compiler.openmp_flag`` or ``self.compiler.pic_flag``.
* **Platform flags**
These flags control the type of architecture that the executable
is compiled for. Watch out for variables like ``PLAT`` or ``ARCH``.
* **Dependencies**
Look out for variables that sound like they could be used to
locate dependencies, such as ``JAVA_HOME``, ``JPEG_ROOT``, or
``ZLIBDIR``. Also watch out for variables that control linking,
such as ``LIBS``, ``LDFLAGS``, and ``INCLUDES``. These variables
need to be set to the installation prefix of a dependency, or
to the correct linker flags to link to that dependency.
* **Installation prefix**
If your Makefile has an ``install`` target, it needs some way of
knowing where to install. By default, many packages install to
``/usr`` or ``/usr/local``. Since many Spack users won't have
sudo privileges, it is imperative that each package is installed
to the proper prefix. Look for variables like ``PREFIX`` or
``INSTALL``.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Makefiles in a sub-directory
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Not every package places their Makefile in the root of the package
tarball. If the Makefile is in a sub-directory like ``src``, you
can tell Spack where to locate it like so:
.. code-block:: python
build_directory = 'src'
^^^^^^^^^^^^^^^^^^^
Manual installation
^^^^^^^^^^^^^^^^^^^
Not every Makefile includes an ``install`` target. If this is the
case, you can override the default ``install`` method to manually
install the package:
.. code-block:: python
def install(self, spec, prefix):
mkdir(prefix.bin)
install('foo', prefix.bin)
install_tree('lib', prefix.lib)
^^^^^^^^^^^^^^^^^^^^^^
External documentation
^^^^^^^^^^^^^^^^^^^^^^
For more information on reading and writing Makefiles, see:
https://www.gnu.org/software/make/manual/make.html
.. _octavepackage:
-------------
OctavePackage
-------------
Octave has its own build system for installing packages.
^^^^^^
Phases
^^^^^^
The ``OctavePackage`` base class has a single phase:
#. ``install`` - install the package
By default, this phase runs the following command:
.. code-block:: console
$ octave '--eval' 'pkg prefix <prefix>; pkg install <archive_file>'
Beware that uninstallation is not implemented at the moment. After uninstalling
a package via Spack, you also need to manually uninstall it from Octave via
``pkg uninstall <package_name>``.
^^^^^^^^^^^^^^^^^^^^^^^
Finding Octave packages
^^^^^^^^^^^^^^^^^^^^^^^
Most Octave packages are listed at https://octave.sourceforge.io/packages.php.
^^^^^^^^^^^^
Dependencies
^^^^^^^^^^^^
Usually, the homepage of a package will list dependencies, e.g.
``Dependencies: Octave >= 3.6.0 struct >= 1.0.12``. The same information should
be available in the ``DESCRIPTION`` file in the root of each archive.
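A dependency line like the one above would translate into something like the
following sketch (assuming the ``struct`` extension is packaged in Spack as
``octave-struct``):

.. code-block:: python

   depends_on('octave@3.6.0:', type=('build', 'run'))
   depends_on('octave-struct@1.0.12:', type=('build', 'run'))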
^^^^^^^^^^^^^^^^^^^^^^
External Documentation
^^^^^^^^^^^^^^^^^^^^^^
For more information on the Octave build system, see:
https://octave.org/doc/v4.4.0/Installing-and-Removing-Packages.html
.. _perlpackage:
-----------
PerlPackage
-----------
Much like Octave, Perl has its own language-specific
build system.
^^^^^^
Phases
^^^^^^
The ``PerlPackage`` base class comes with 3 phases that can be overridden:
#. ``configure`` - configure the package
#. ``build`` - build the package
#. ``install`` - install the package
Perl packages have 2 common modules used for module installation:
"""""""""""""""""""""""
``ExtUtils::MakeMaker``
"""""""""""""""""""""""
The ``ExtUtils::MakeMaker`` module is just what it sounds like, a module
designed to generate Makefiles. It can be identified by the presence of
a ``Makefile.PL`` file, and has the following installation steps:
.. code-block:: console
$ perl Makefile.PL INSTALL_BASE=/path/to/installation/prefix
$ make
$ make test # optional
$ make install
"""""""""""""""""
``Module::Build``
"""""""""""""""""
The ``Module::Build`` module is a pure-Perl build system, and can be
identified by the presence of a ``Build.PL`` file. It has the following
installation steps:
.. code-block:: console
$ perl Build.PL --install_base /path/to/installation/prefix
$ ./Build
$ ./Build test # optional
$ ./Build install
If both ``Makefile.PL`` and ``Build.PL`` files exist in the package,
Spack will use ``Makefile.PL`` by default. If your package uses a
different module, ``PerlPackage`` will need to be extended to support
it.
``PerlPackage`` automatically detects which build steps to use, so there
shouldn't be much work on the package developer's side to get things
working.
^^^^^^^^^^^^^^^^^^^^^
Finding Perl packages
^^^^^^^^^^^^^^^^^^^^^
Most Perl modules are hosted on CPAN - The Comprehensive Perl Archive
Network. If you need to find a package for ``XML::Parser``, for example,
you should search for "CPAN XML::Parser".
Some CPAN pages are versioned. Check for a link to the
"Latest Release" to make sure you have the latest version.
^^^^^^^^^^^^
Package name
^^^^^^^^^^^^
When you use ``spack create`` to create a new Perl package, Spack will
automatically prepend ``perl-`` to the front of the package name. This
helps to keep Perl modules separate from other packages. The same
naming scheme is used for other language extensions, like Python and R.
^^^^^^^^^^^
Description
^^^^^^^^^^^
Most CPAN pages have a short description under "NAME" and a longer
description under "DESCRIPTION". Use whichever you think is more
useful while still being succinct.
^^^^^^^^
Homepage
^^^^^^^^
In the top-right corner of the CPAN page, you'll find a "permalink"
for the package. This should be used instead of the current URL, as
it doesn't contain the version number and will always link to the
latest release.
^^^
URL
^^^
If you haven't found it already, the download URL is on the right
side of the page below the permalink. Search for "Download".
^^^^^^^^^^^^^^^^^^^^^^^^^
Build system dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^
Every ``PerlPackage`` obviously depends on Perl at build and run-time,
so ``PerlPackage`` contains:
.. code-block:: python
extends('perl')
depends_on('perl', type=('build', 'run'))
If your package requires a specific version of Perl, you should
specify this.
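For example (the version bound below is purely illustrative):

.. code-block:: python

   depends_on('perl@5.10.0:', type=('build', 'run'))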
Although newer versions of Perl include ``ExtUtils::MakeMaker`` and
``Module::Build`` as "core" modules, you may want to add dependencies
on ``perl-extutils-makemaker`` and ``perl-module-build`` anyway. Many
people add Perl as an external package, and we want the build to work
properly. If your package uses ``Makefile.PL`` to build, add:
.. code-block:: python
depends_on('perl-extutils-makemaker', type='build')
If your package uses ``Build.PL`` to build, add:
.. code-block:: python
depends_on('perl-module-build', type='build')
^^^^^^^^^^^^^^^^^
Perl dependencies
^^^^^^^^^^^^^^^^^
Below the download URL, you will find a "Dependencies" link, which
takes you to a page listing all of the dependencies of the package.
Packages listed as "Core module" don't need to be added as dependencies,
but all direct dependencies should be added. Don't add dependencies of
dependencies. These should be added as dependencies to the dependency,
not to your package.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Passing arguments to configure
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Packages that have non-Perl dependencies often use command-line
variables to specify their installation directory. You can pass
arguments to ``Makefile.PL`` or ``Build.PL`` by overriding
``configure_args`` like so:
.. code-block:: python
def configure_args(self):
expat = self.spec['expat'].prefix
return [
'EXPATLIBPATH={0}'.format(expat.lib),
'EXPATINCPATH={0}'.format(expat.include),
]
^^^^^^^^^^^^^^^^^^^^^
Alternatives to Spack
^^^^^^^^^^^^^^^^^^^^^
If you need to maintain a stack of Perl modules for a user and don't
want to add all of them to Spack, a good alternative is ``cpanm``.
If Perl is already installed on your system, it should come with a
``cpan`` executable. To install ``cpanm``, run the following command:
.. code-block:: console
$ cpan App::cpanminus
Now, you can install any Perl module you want by running:
.. code-block:: console
$ cpanm Module::Name
Obviously, these commands can only be run if you have root privileges.
Furthermore, ``cpanm`` is not capable of installing non-Perl dependencies.
If you need to install to your home directory or need to install a module
with non-Perl dependencies, Spack is a better option.
^^^^^^^^^^^^^^^^^^^^^^
External documentation
^^^^^^^^^^^^^^^^^^^^^^
You can find more information on installing Perl modules from source
at: http://www.perlmonks.org/?node_id=128077
More generic Perl module installation instructions can be found at:
http://www.cpan.org/modules/INSTALL.html
.. _qmakepackage:
------------
QMakePackage
------------
Much like Autotools and CMake, QMake is a build-script generator
designed by the developers of Qt. In its simplest form, Spack's
``QMakePackage`` runs the following steps:
.. code-block:: console
$ qmake
$ make
$ make check # optional
$ make install
QMake does not appear to have a standardized way of specifying
the installation directory, so you may have to set environment
variables or edit ``*.pro`` files to get things working properly.
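As a rough sketch, a package might rewrite a hard-coded install path in its
``.pro`` file before ``qmake`` runs (the file name and original path here are
hypothetical):

.. code-block:: python

   def patch(self):
       # Point the hard-coded install location at the Spack prefix
       filter_file('target.path = /usr/local',
                   'target.path = {0}'.format(self.prefix),
                   'myproject.pro', string=True)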
^^^^^^
Phases
^^^^^^
The ``QMakePackage`` base class comes with the following phases:
#. ``qmake`` - generate Makefiles
#. ``build`` - build the project
#. ``install`` - install the project
By default, these phases run:
.. code-block:: console
$ qmake
$ make
$ make install
Any of these phases can be overridden in your package as necessary.
There is also a ``check`` method that looks for a ``check`` target
in the Makefile. If a ``check`` target exists and the user runs:
.. code-block:: console
$ spack install --test=root <qmake-package>
Spack will run ``make check`` after the build phase.
^^^^^^^^^^^^^^^
Important files
^^^^^^^^^^^^^^^
Packages that use the QMake build system can be identified by the
presence of a ``<project-name>.pro`` file. This file declares things
like build instructions and dependencies.
One thing to look for is the ``minQtVersion`` function:
.. code-block:: none
minQtVersion(5, 6, 0)
This means that Qt 5.6.0 is the earliest release that will work.
You should specify this in a ``depends_on`` statement.
^^^^^^^^^^^^^^^^^^^^^^^^^
Build system dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^
At the bare minimum, packages that use the QMake build system need a
``qt`` dependency. Since this is always the case, the ``QMakePackage``
base class already contains:
.. code-block:: python
depends_on('qt', type='build')
If you want to specify a particular version requirement, or need to
link to the ``qt`` libraries, you can override this in your package:
.. code-block:: python
depends_on('qt@5.6.0:')
^^^^^^^^^^^^^^^^^^^^^^^^^^
Passing arguments to qmake
^^^^^^^^^^^^^^^^^^^^^^^^^^
If you need to pass any arguments to the ``qmake`` call, you can
override the ``qmake_args`` method like so:
.. code-block:: python
def qmake_args(self):
return ['-recursive']
This method can be used to pass flags as well as variables.
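For example, a hedged sketch that passes a variable assignment (``PREFIX`` is
only a common convention; the variable a project actually reads depends on its
``.pro`` files):

.. code-block:: python

   def qmake_args(self):
       return ['PREFIX={0}'.format(self.prefix)]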
^^^^^^^^^^^^^^^^^^^^^^
External documentation
^^^^^^^^^^^^^^^^^^^^^^
For more information on the QMake build system, see:
http://doc.qt.io/qt-5/qmake-manual.html
.. _rpackage:
--------
RPackage
--------
Like Python, R has its own built-in build system.
The R build system is remarkably uniform and well-tested.
This makes it one of the easiest build systems to create
new Spack packages for.
^^^^^^
Phases
^^^^^^
The ``RPackage`` base class has a single phase:
#. ``install`` - install the package
By default, this phase runs the following command:
.. code-block:: console
$ R CMD INSTALL --library=/path/to/installation/prefix/rlib/R/library .
^^^^^^^^^^^^^^^^^^
Finding R packages
^^^^^^^^^^^^^^^^^^
The vast majority of R packages are hosted on CRAN - The Comprehensive
R Archive Network. If you are looking for a particular R package, search
for "CRAN <package-name>" and you should quickly find what you want.
If it isn't on CRAN, try Bioconductor, another common R repository.
For the purposes of this tutorial, we will be walking through
`r-caret <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/r-caret/package.py>`_
as an example. If you search for "CRAN caret", you will quickly find what
you are looking for at https://cran.r-project.org/web/packages/caret/index.html.
If you search for "Package source", you will find the download URL for
the latest release. Use this URL with ``spack create`` to create a new
package.
^^^^^^^^^^^^
Package name
^^^^^^^^^^^^
The first thing you'll notice is that Spack prepends ``r-`` to the front
of the package name. This is how Spack separates R package extensions
from the rest of the packages in Spack. Without this, we would end up
with package name collisions more frequently than we would like. For
instance, there are already packages for both:
* ``ape`` and ``r-ape``
* ``curl`` and ``r-curl``
* ``gmp`` and ``r-gmp``
* ``jpeg`` and ``r-jpeg``
* ``openssl`` and ``r-openssl``
* ``uuid`` and ``r-uuid``
* ``xts`` and ``r-xts``
Many popular programs written in C/C++ are later ported to R as a
separate project.
^^^^^^^^^^^
Description
^^^^^^^^^^^
The first thing you'll need to add to your new package is a description.
The top of the homepage for ``caret`` lists the following description:
caret: Classification and Regression Training
Misc functions for training and plotting classification and regression models.
You can either use the short description (first line), long description
(second line), or both depending on what you feel is most appropriate.
^^^^^^^^
Homepage
^^^^^^^^
If you look at the bottom of the page, you'll see:
Linking:
Please use the canonical form https://CRAN.R-project.org/package=caret to link to this page.
Please uphold the wishes of the CRAN admins and use
https://CRAN.R-project.org/package=caret as the homepage instead of
https://cran.r-project.org/web/packages/caret/index.html. The latter may
change without notice.
^^^
URL
^^^
As previously mentioned, the download URL for the latest release can be
found by searching "Package source" on the homepage.
^^^^^^^^
List URL
^^^^^^^^
CRAN maintains a single webpage containing the latest release of every
single package: https://cran.r-project.org/src/contrib/
Of course, as soon as a new release comes out, the version you were using
in your package is no longer available at that URL. It is moved to an
archive directory. If you search for "Old sources", you will find:
https://cran.r-project.org/src/contrib/Archive/caret
If you only specify the URL for the latest release, your package will
no longer be able to fetch that version as soon as a new release comes
out. To get around this, add the archive directory as a ``list_url``.
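For ``r-caret``, that looks roughly like the following (the version embedded
in ``url`` is illustrative and changes over time):

.. code-block:: python

   url      = 'https://cran.r-project.org/src/contrib/caret_6.0-76.tar.gz'
   list_url = 'https://cran.r-project.org/src/contrib/Archive/caret'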
^^^^^^^^^^^^^^^^^^^^^^^^^
Build system dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^
As an extension of the R ecosystem, your package will obviously depend
on R to build and run. Normally, we would use ``depends_on`` to express
this, but for R packages, we use ``extends``. ``extends`` is similar to
``depends_on``, but adds an additional feature: the ability to "activate"
the package by symlinking it to the R installation directory. Since
every R package needs this, the ``RPackage`` base class contains:
.. code-block:: python
extends('r')
depends_on('r', type=('build', 'run'))
Take a close look at the homepage for ``caret``. If you look at the
"Depends" section, you'll notice that ``caret`` depends on "R (≥ 2.10)".
You should add this to your package like so:
.. code-block:: python
depends_on('r@2.10:', type=('build', 'run'))
^^^^^^^^^^^^^^
R dependencies
^^^^^^^^^^^^^^
R packages are often small and follow the classic Unix philosophy
of doing one thing well. They are modular and usually depend on
several other packages. You may find a single package with over a
hundred dependencies. Luckily, CRAN packages are well-documented
and list all of their dependencies in the following sections:
* Depends
* Imports
* LinkingTo
As far as Spack is concerned, all 3 of these dependency types
correspond to ``type=('build', 'run')``, so you don't have to worry
about them. If you are curious what they mean,
https://github.com/spack/spack/issues/2951 has a pretty good summary:
``Depends`` is required and will cause those R packages to be *attached*,
that is, their APIs are exposed to the user. ``Imports`` *loads* packages
so that *the package* importing these packages can access their APIs,
while *not* being exposed to the user. When a user calls ``library(foo)``
s/he *attaches* package ``foo`` and all of the packages under ``Depends``.
Any function in one of these packages can be called directly as ``bar()``.
If there are conflicts, the user can also specify ``pkgA::bar()`` and
``pkgB::bar()`` to distinguish between them. Historically, there was only
``Depends`` and ``Suggests``, hence the confusing names. Today, maybe
``Depends`` would have been named ``Attaches``.
The ``LinkingTo`` is not perfect and there was recently an extensive
discussion about API/ABI among other things on the R-devel mailing
list among very skilled R developers:
* https://stat.ethz.ch/pipermail/r-devel/2016-December/073505.html
* https://stat.ethz.ch/pipermail/r-devel/2017-January/073647.html
Some packages also have a fourth section:
* Suggests
These are optional, rarely-used dependencies that a user might find
useful. You should **NOT** add these dependencies to your package.
R packages already have enough dependencies as it is, and adding
optional dependencies can really slow down the concretization
process. They can also introduce circular dependencies.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Core, recommended, and non-core packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
If you look at "Depends", "Imports", and "LinkingTo", you will notice
3 different types of packages:
"""""""""""""
Core packages
"""""""""""""
If you look at the ``caret`` homepage, you'll notice a few dependencies
that don't have a link to the package, like ``methods``, ``stats``, and
``utils``. These packages are part of the core R distribution and are
tied to the R version installed. You can basically consider these to be
"R itself". These are so essential to R so it would not make sense that
they could be updated via CRAN. If so, you would basically get a different
version of R. Thus, they're updated when R is updated.
You can find a list of these core libraries at:
https://github.com/wch/r-source/tree/trunk/src/library
""""""""""""""""""""
Recommended packages
""""""""""""""""""""
When you install R, there is an option called ``--with-recommended-packages``.
This flag causes the R installation to include a few "Recommended" packages
(legacy term). For historical reasons, these packages are closely tied to the
core R distribution and are developed by the R core team or people closely
associated with it. The core R distribution "knows" about these packages, but
they are distributed via CRAN, so they can also be updated between R releases.
Spack explicitly adds the ``--without-recommended-packages`` flag to prevent
the installation of these packages. Due to the way Spack handles package
activation (symlinking packages to the R installation directory),
pre-existing recommended packages will cause conflicts for already-existing
files. We could either not include these recommended packages in Spack and
require them to be installed through ``--with-recommended-packages``, or
we could not install them with R and let users choose the version of the
package they want to install. We chose the latter.
Since these packages are so commonly distributed with the R system, many
developers may assume these packages exist and fail to list them as
dependencies. Watch out for this.
You can find a list of these recommended packages at:
https://github.com/wch/r-source/blob/trunk/share/make/vars.mk
"""""""""""""""""
Non-core packages
"""""""""""""""""
These are packages that are neither "core" nor "recommended". There are more
than 10,000 of these packages hosted on CRAN alone.
For each of these package types, if you see that a specific version is
required, for example, "lattice (≥ 0.20)", please add this information to
the dependency:
.. code-block:: python
depends_on('r-lattice@0.20:', type=('build', 'run'))
^^^^^^^^^^^^^^^^^^
Non-R dependencies
^^^^^^^^^^^^^^^^^^
Some packages depend on non-R libraries for linking. Check out the
`r-stringi <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/r-stringi/package.py>`_
package for an example: https://CRAN.R-project.org/package=stringi.
If you search for the text "SystemRequirements", you will see:
ICU4C (>= 52, optional)
This is how non-R dependencies are listed. Make sure to add these
dependencies. The default dependency type should suffice.
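For example, a minimal sketch of how the ``icu4c`` dependency might be
declared for ``r-stringi``, with the version constraint mirroring the
``SystemRequirements`` entry above:

.. code-block:: python

   # Non-R dependency used for linking; the default dependency type is fine.
   depends_on('icu4c@52:')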
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Passing arguments to the installation
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Some R packages provide additional flags that can be passed to
``R CMD INSTALL``, often to locate non-R dependencies.
`r-rmpi <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/r-rmpi/package.py>`_
is an example of this; it accepts flags for locating and linking to an MPI library. To pass
these to the installation command, you can override ``configure_args``
like so:
.. code-block:: python
def configure_args(self, spec, prefix):
mpi_name = spec['mpi'].name
# The type of MPI. Supported values are:
# OPENMPI, LAM, MPICH, MPICH2, or CRAY
if mpi_name == 'openmpi':
Rmpi_type = 'OPENMPI'
elif mpi_name == 'mpich':
Rmpi_type = 'MPICH2'
else:
raise InstallError('Unsupported MPI type')
return [
'--with-Rmpi-type={0}'.format(Rmpi_type),
'--with-mpi={0}'.format(spec['mpi'].prefix),
]
There is a similar ``configure_vars`` function that can be overridden
to pass variables to the build.
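As a sketch, overriding ``configure_vars`` looks much like ``configure_args``.
The variable names below are hypothetical and must match what the package's
own configure script expects; the example also assumes a ``zlib`` dependency:

.. code-block:: python

   def configure_vars(self, spec, prefix):
       # Hypothetical variables; check the package's installation notes
       # for the names it actually honors.
       return [
           'INCLUDE_DIR={0}'.format(spec['zlib'].prefix.include),
           'LIB_DIR={0}'.format(spec['zlib'].prefix.lib),
       ]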
^^^^^^^^^^^^^^^^^^^^^
Alternatives to Spack
^^^^^^^^^^^^^^^^^^^^^
CRAN hosts over 10,000 R packages, most of which are not in Spack. Many
users may not need the advanced features of Spack, and may prefer to
install R packages the normal way:
.. code-block:: console
$ R
> install.packages("ggplot2")
R will search CRAN for the ``ggplot2`` package and install all necessary
dependencies for you. If you want to update all installed R packages to
the latest release, you can use:
.. code-block:: console
> update.packages(ask = FALSE)
This works great for users who have internet access, but those on an
air-gapped cluster will find it easier to let Spack build a download
mirror and install these packages for them.
Where Spack really shines is its ability to install non-R dependencies
and link to them properly, something the R installation mechanism
cannot handle.
^^^^^^^^^^^^^^^^^^^^^^
External documentation
^^^^^^^^^^^^^^^^^^^^^^
For more information on installing R packages, see:
https://stat.ethz.ch/R-manual/R-devel/library/utils/html/INSTALL.html
.. _rubypackage:
-----------
RubyPackage
-----------
Like Perl, Python, and R, Ruby has its own build system for
installing Ruby gems.
This build system is a work-in-progress. See
https://github.com/spack/spack/pull/3127 for more information.
.. _sconspackage:
------------
SConsPackage
------------
SCons is a general-purpose build system that does not rely on
Makefiles to build software. SCons is written in Python, and handles
all building and linking itself.
As far as build systems go, SCons is very non-uniform. It provides a
common framework for developers to write build scripts, but the build
scripts themselves can vary drastically. Some developers add subcommands
like:
.. code-block:: console
$ scons clean
$ scons build
$ scons test
$ scons install
Others don't add any subcommands. Some have configuration options that
can be specified through variables on the command line. Others don't.
^^^^^^
Phases
^^^^^^
As previously mentioned, SCons allows developers to add subcommands like
``build`` and ``install``, but by default, installation usually looks like:
.. code-block:: console
$ scons
$ scons install
To facilitate this, the ``SConsPackage`` base class provides the
following phases:
#. ``build`` - build the package
#. ``install`` - install the package
Package developers often add unit tests that can be invoked with
``scons test`` or ``scons check``. Spack provides a ``test`` method
to handle this. Since we don't know which one the package developer
chose, the ``test`` method does nothing by default, but can be easily
overridden like so:
.. code-block:: python
def test(self):
scons('check')
^^^^^^^^^^^^^^^
Important files
^^^^^^^^^^^^^^^
SCons packages can be identified by their ``SConstruct`` files. These
files handle everything from setting up subcommands and command-line
options to linking and compiling.
One thing to look for is the ``EnsureSConsVersion`` function:
.. code-block:: none
EnsureSConsVersion(2, 3, 0)
This means that SCons 2.3.0 is the earliest release that will work.
You should specify this in a ``depends_on`` statement.
^^^^^^^^^^^^^^^^^^^^^^^^^
Build system dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^
At the bare minimum, packages that use the SCons build system need a
``scons`` dependency. Since this is always the case, the ``SConsPackage``
base class already contains:
.. code-block:: python
depends_on('scons', type='build')
If you want to specify a particular version requirement, you can override
this in your package:
.. code-block:: python
depends_on('scons@2.3.0:', type='build')
^^^^^^^^^^^^^^^^^^^^^^^^^
Finding available options
^^^^^^^^^^^^^^^^^^^^^^^^^
The first place to start when looking for a list of valid options to
build a package is ``scons --help``. Some packages like
`kahip <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/kahip/package.py>`_
don't bother overriding the default SCons help message, so this isn't
very useful, but other packages like
`serf <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/serf/package.py>`_
print a list of valid command-line variables:
.. code-block:: console
$ scons --help
scons: Reading SConscript files ...
Checking for GNU-compatible C compiler...yes
scons: done reading SConscript files.
PREFIX: Directory to install under ( /path/to/PREFIX )
default: /usr/local
actual: /usr/local
LIBDIR: Directory to install architecture dependent libraries under ( /path/to/LIBDIR )
default: $PREFIX/lib
actual: /usr/local/lib
APR: Path to apr-1-config, or to APR's install area ( /path/to/APR )
default: /usr
actual: /usr
APU: Path to apu-1-config, or to APR's install area ( /path/to/APU )
default: /usr
actual: /usr
OPENSSL: Path to OpenSSL's install area ( /path/to/OPENSSL )
default: /usr
actual: /usr
ZLIB: Path to zlib's install area ( /path/to/ZLIB )
default: /usr
actual: /usr
GSSAPI: Path to GSSAPI's install area ( /path/to/GSSAPI )
default: None
actual: None
DEBUG: Enable debugging info and strict compile warnings (yes|no)
default: False
actual: False
APR_STATIC: Enable using a static compiled APR (yes|no)
default: False
actual: False
CC: Command name or path of the C compiler
default: None
actual: gcc
CFLAGS: Extra flags for the C compiler (space-separated)
default: None
actual:
LIBS: Extra libraries passed to the linker, e.g. "-l<library1> -l<library2>" (space separated)
default: None
actual: None
LINKFLAGS: Extra flags for the linker (space-separated)
default: None
actual:
CPPFLAGS: Extra flags for the C preprocessor (space separated)
default: None
actual: None
Use scons -H for help about command-line options.
More advanced packages like
`cantera <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/cantera/package.py>`_
use ``scons --help`` to print a list of subcommands:
.. code-block:: console
$ scons --help
scons: Reading SConscript files ...
SCons build script for Cantera
Basic usage:
'scons help' - print a description of user-specifiable options.
'scons build' - Compile Cantera and the language interfaces using
default options.
'scons clean' - Delete files created while building Cantera.
'[sudo] scons install' - Install Cantera.
'[sudo] scons uninstall' - Uninstall Cantera.
'scons test' - Run all tests which did not previously pass or for which the
results may have changed.
'scons test-reset' - Reset the passing status of all tests.
'scons test-clean' - Delete files created while running the tests.
'scons test-help' - List available tests.
'scons test-NAME' - Run the test named "NAME".
'scons <command> dump' - Dump the state of the SCons environment to the
screen instead of doing <command>, e.g.
'scons build dump'. For debugging purposes.
'scons samples' - Compile the C++ and Fortran samples.
'scons msi' - Build a Windows installer (.msi) for Cantera.
'scons sphinx' - Build the Sphinx documentation
'scons doxygen' - Build the Doxygen documentation
You'll notice that cantera provides a ``scons help`` subcommand. Running
``scons help`` prints a list of valid command-line variables.
^^^^^^^^^^^^^^^^^^^^^^^^^^
Passing arguments to scons
^^^^^^^^^^^^^^^^^^^^^^^^^^
Now that you know what arguments the project accepts, you can add them to
the package build phase. This is done by overriding ``build_args`` like so:
.. code-block:: python
def build_args(self, spec, prefix):
args = [
'PREFIX={0}'.format(prefix),
'ZLIB={0}'.format(spec['zlib'].prefix),
]
if '+debug' in spec:
args.append('DEBUG=yes')
else:
args.append('DEBUG=no')
return args
``SConsPackage`` also provides an ``install_args`` function that you can
override to pass additional arguments to ``scons install``.
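For example, a minimal sketch that passes an extra flag only at install time
(the ``--no-docs`` flag here is a placeholder, not a real serf or cantera
option):

.. code-block:: python

   def install_args(self, spec, prefix):
       # Placeholder flag; check ``scons --help`` for the options the
       # project actually accepts at install time.
       return ['--no-docs']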
^^^^^^^^^^^^^^^^^
Compiler wrappers
^^^^^^^^^^^^^^^^^
By default, SCons builds all packages in a separate execution environment,
and doesn't pass any environment variables from the user environment.
Even changes to ``PATH`` are not propagated unless the package developer
explicitly forwards them.
This is particularly troublesome for Spack's compiler wrappers, which depend
on environment variables to manage dependencies and linking flags. In many
cases, SCons packages are not compatible with Spack's compiler wrappers,
and linking must be done manually.
First of all, check the list of valid options for anything relating to
environment variables. For example, cantera has the following option:
.. code-block:: none
* env_vars: [ string ]
Environment variables to propagate through to SCons. Either the
string "all" or a comma separated list of variable names, e.g.
'LD_LIBRARY_PATH,HOME'.
- default: 'LD_LIBRARY_PATH,PYTHONPATH'
In the case of cantera, using ``env_vars=all`` allows us to use
Spack's compiler wrappers. If you don't see an option related to
environment variables, try using Spack's compiler wrappers by passing
``spack_cc``, ``spack_cxx``, and ``spack_fc`` via the ``CC``, ``CXX``,
and ``FC`` arguments, respectively. If you pass them to the build and
you see an error message like:
.. code-block:: none
Spack compiler must be run from Spack! Input 'SPACK_PREFIX' is missing.
you'll know that the package isn't compatible with Spack's compiler
wrappers. In this case, you'll have to use the path to the actual
compilers, which are stored in ``self.compiler.cc`` and friends.
Note that this may involve passing additional flags to the build to
locate dependencies, a task normally done by the compiler wrappers.
serf is an example of a package with this limitation.
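As a hedged sketch, you can try the wrappers first and fall back to the real
compilers if the build rejects them. The ``CC`` and ``CXX`` variable names are
assumptions and must match what the package's ``SConstruct`` actually reads:

.. code-block:: python

   def build_args(self, spec, prefix):
       # Try Spack's compiler wrappers first. If the build fails with the
       # "Spack compiler must be run from Spack!" error, replace these with
       # self.compiler.cc and self.compiler.cxx and add whatever flags are
       # needed to locate dependencies manually.
       return [
           'CC={0}'.format(spack_cc),
           'CXX={0}'.format(spack_cxx),
       ]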
^^^^^^^^^^^^^^^^^^^^^^
External documentation
^^^^^^^^^^^^^^^^^^^^^^
For more information on the SCons build system, see:
http://scons.org/documentation.html
.. _wafpackage:
----------
WafPackage
----------
Like SCons, Waf is a general-purpose build system that does not rely
on Makefiles to build software.
^^^^^^
Phases
^^^^^^
The ``WafPackage`` base class comes with the following phases:
#. ``configure`` - configure the project
#. ``build`` - build the project
#. ``install`` - install the project
By default, these phases run:
.. code-block:: console
$ python waf configure --prefix=/path/to/installation/prefix
$ python waf build
$ python waf install
Each of these is a standard Waf command, and all of them can be listed by running:
.. code-block:: console
$ python waf --help
Each phase provides a ``<phase>`` function that runs:
.. code-block:: console
$ python waf -j<jobs> <phase>
where ``<jobs>`` is the number of parallel jobs to build with. Each phase
also has a ``<phase_args>`` function that can pass arguments to this call.
All of these functions are empty except for the ``configure_args``
function, which passes ``--prefix=/path/to/installation/prefix``.
^^^^^^^
Testing
^^^^^^^
``WafPackage`` also provides ``test`` and ``installtest`` methods,
which are run after the ``build`` and ``install`` phases, respectively.
By default, these phases do nothing, but you can override them to
run package-specific unit tests. For example, the
`py-py2cairo <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/py-py2cairo/package.py>`_
package uses:
.. code-block:: python
def installtest(self):
with working_dir('test'):
pytest = which('py.test')
pytest()
^^^^^^^^^^^^^^^
Important files
^^^^^^^^^^^^^^^
Each Waf package comes with a custom ``waf`` build script, written in
Python. This script contains instructions to build the project.
The package also comes with a ``wscript`` file. This file is used to
override the default ``configure``, ``build``, and ``install`` phases
to customize the Waf project. It also allows developers to override
the default ``./waf --help`` message. Check this file to find useful
information about dependencies and the minimum versions that are
supported.
^^^^^^^^^^^^^^^^^^^^^^^^^
Build system dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^
``WafPackage`` does not require ``waf`` as a build dependency; ``waf`` is only
needed to generate the ``./waf`` script that ships with the project. Since ``./waf`` is a Python
script, Python is needed to build the project. ``WafPackage`` adds
the following dependency automatically:
.. code-block:: python
depends_on('python@2.5:', type='build')
Waf only supports Python 2.5 and up.
^^^^^^^^^^^^^^^^^^^^^^^^
Passing arguments to waf
^^^^^^^^^^^^^^^^^^^^^^^^
As previously mentioned, each phase comes with a ``<phase_args>``
function that can be used to pass arguments to that particular
phase. For example, if you need to pass arguments to the build
phase, you can use:
.. code-block:: python
def build_args(self, spec, prefix):
args = []
if self.run_tests:
args.append('--test')
return args
A list of valid options can be found by running ``./waf --help``.
^^^^^^^^^^^^^^^^^^^^^^
External documentation
^^^^^^^^^^^^^^^^^^^^^^
For more information on the Waf build system, see:
https://waf.io/book/
@@ -73,6 +73,7 @@ or refer to the full manual below.
    contribution_guide
    packaging_guide
+   build_systems
    developer_guide
    docker_for_developers
    Spack API Docs <spack>