Commit 497ed9f9 authored by Massimiliano Culpo's avatar Massimiliano Culpo

Merge branch 'develop' of https://github.com/LLNL/spack into openmpi_variants

Conflicts:
	var/spack/repos/builtin/packages/openmpi/package.py
parents 6cffac79 ea408cc0
Showing 601 additions and 172 deletions
@@ -372,25 +372,32 @@ how this is done is in :ref:`sec-specs`.
``spack compiler add``
~~~~~~~~~~~~~~~~~~~~~~~
An alias for ``spack compiler find``.
.. _spack-compiler-find:
``spack compiler find``
~~~~~~~~~~~~~~~~~~~~~~~
If you do not see a compiler in this list, but you want to use it with
Spack, you can simply run ``spack compiler find`` with the path to
where the compiler is installed.  For example::

    $ spack compiler find /usr/local/tools/ic-13.0.079
    ==> Added 1 new compiler to /Users/gamblin2/.spack/compilers.yaml
        intel@13.0.079

Or you can run ``spack compiler find`` with no arguments to force
auto-detection.  This is useful if you do not know where compilers are
installed, but you know that new compilers have been added to your
``PATH``.  For example, using dotkit, you might do this::

    $ module load gcc-4.9.0
    $ spack compiler find
    ==> Added 1 new compiler to /Users/gamblin2/.spack/compilers.yaml
        gcc@4.9.0

This loads the environment module for gcc-4.9.0 to add it to
``PATH``, and then it adds the compiler to Spack.
.. _spack-compiler-info:
@@ -807,17 +814,22 @@ Environment Modules, you can get it with Spack:
1. Install with:

   .. code-block:: sh

      spack install environment-modules

2. Activate with:

   Add the following two lines to your ``.bashrc`` profile (or similar):

   .. code-block:: sh

      MODULES_HOME=`spack location -i environment-modules`
      source ${MODULES_HOME}/Modules/init/bash

   In case you use a Unix shell other than bash, substitute ``bash`` by
   the appropriate file in ``${MODULES_HOME}/Modules/init/``.
Spack and Environment Modules
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
......
@@ -1831,6 +1831,23 @@ successfully find ``libdwarf.h`` and ``libdwarf.so``, without the
packager having to provide ``--with-libdwarf=/path/to/libdwarf`` on
the command line.
Message Passing Interface (MPI)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
It is common for high performance computing software to use ``MPI``.
As a result of concretization, a given package can be built using different
implementations of MPI, such as ``OpenMPI``, ``MPICH``, or ``IntelMPI``.
In some scenarios, configuring a package requires passing it the appropriate MPI
compiler wrappers, such as ``mpicc`` or ``mpic++``.
However, different implementations of ``MPI`` may name these wrappers
differently. To make a package's ``install()`` method independent of the
choice of ``MPI`` implementation, each package that provides ``MPI`` sets up
``self.spec.mpicc``, ``self.spec.mpicxx``, ``self.spec.mpifc``, and ``self.spec.mpif77``
to point to the ``C``, ``C++``, ``Fortran 90``, and ``Fortran 77`` ``MPI`` wrappers.
For the reasons outlined above, package developers are advised to use these
variables, for example ``self.spec['mpi'].mpicc``, instead of hard-coding
``join_path(self.spec['mpi'].prefix.bin, 'mpicc')``.
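To make this concrete, here is a minimal, self-contained sketch (not real Spack code; ``MockMpiSpec`` and the wrapper paths are hypothetical stand-ins) of why reading the wrapper attribute beats hard-coding a wrapper name:

```python
import os

class MockMpiSpec(object):
    """Stands in for a concretized MPI spec; each implementation
    points the attributes at its own wrapper names."""
    def __init__(self, prefix, cc_name, cxx_name):
        self.prefix = prefix
        self.mpicc = os.path.join(prefix, 'bin', cc_name)
        self.mpicxx = os.path.join(prefix, 'bin', cxx_name)

# Two implementations that name their C++ wrapper differently:
openmpi = MockMpiSpec('/opt/openmpi', 'mpicc', 'mpic++')
mpich = MockMpiSpec('/opt/mpich', 'mpicc', 'mpicxx')

# A package that reads spec.mpicxx works with either implementation;
# hard-coding a path ending in 'mpicxx' would break for OpenMPI here.
for mpi in (openmpi, mpich):
    print(mpi.mpicxx)
```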
Forking ``install()``
~~~~~~~~~~~~~~~~~~~~~
......
@@ -162,7 +162,7 @@ fi
# It doesn't work with -rpath.
# This variable controls whether they are added.
add_rpaths=true
if [[ $mode == ld && "$SPACK_SHORT_SPEC" =~ "darwin" ]]; then
    for arg in "$@"; do
        if [[ $arg == -r ]]; then
            add_rpaths=false
......
@@ -136,9 +136,7 @@
    # don't add a second username if it's already unique by user.
    if not _tmp_user in path:
        tmp_dirs.append(join_path(path, '%u', 'spack-stage'))
    tmp_dirs.append(join_path(path, 'spack-stage'))
# Whether spack should allow installation of unsafe versions of
......
@@ -44,10 +44,10 @@ def setup_parser(subparser):
    scopes = spack.config.config_scopes

    # Find
    find_parser = sp.add_parser('find', aliases=['add'],
                                help='Search the system for compilers to add to the Spack configuration.')
    find_parser.add_argument('add_paths', nargs=argparse.REMAINDER)
    find_parser.add_argument('--scope', choices=scopes, default=spack.cmd.default_modify_scope,
                             help="Configuration scope to modify.")

    # Remove
@@ -70,7 +70,7 @@ def setup_parser(subparser):
                             help="Configuration scope to read from.")


def compiler_find(args):
    """Search either $PATH or a list of paths for compilers and add them
    to Spack's configuration."""
    paths = args.add_paths
@@ -136,7 +136,8 @@ def compiler_list(args):
def compiler(parser, args):
    action = { 'add'    : compiler_find,
               'find'   : compiler_find,
               'remove' : compiler_remove,
               'rm'     : compiler_remove,
               'info'   : compiler_info,
......
@@ -23,87 +23,106 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import argparse
import codecs
import os
import time
import xml.dom.minidom
import xml.etree.ElementTree as ET

import llnl.util.tty as tty

import spack
import spack.cmd
from llnl.util.filesystem import *
from spack.build_environment import InstallError
from spack.fetch_strategy import FetchError

description = "Run package installation as a unit test, output formatted results."


def setup_parser(subparser):
    subparser.add_argument('-j',
                           '--jobs',
                           action='store',
                           type=int,
                           help="Explicitly set number of make jobs. Default is #cpus.")
    subparser.add_argument('-n',
                           '--no-checksum',
                           action='store_true',
                           dest='no_checksum',
                           help="Do not check packages against checksum")
    subparser.add_argument('-o', '--output', action='store',
                           help="test output goes in this file")
    subparser.add_argument('package', nargs=argparse.REMAINDER,
                           help="spec of package to install")


class TestResult(object):
    PASSED = 0
    FAILED = 1
    SKIPPED = 2
    ERRORED = 3


class TestSuite(object):
    def __init__(self, filename):
        self.filename = filename
        self.root = ET.Element('testsuite')
        self.tests = []

    def __enter__(self):
        return self

    def append(self, item):
        if not isinstance(item, TestCase):
            raise TypeError('only TestCase instances may be appended to a TestSuite instance')
        self.tests.append(item)  # Append the item to the list of tests

    def __exit__(self, exc_type, exc_val, exc_tb):
        # Prepare the header for the entire test suite
        number_of_errors = sum(x.result_type == TestResult.ERRORED for x in self.tests)
        self.root.set('errors', str(number_of_errors))
        number_of_failures = sum(x.result_type == TestResult.FAILED for x in self.tests)
        self.root.set('failures', str(number_of_failures))
        self.root.set('tests', str(len(self.tests)))

        for item in self.tests:
            self.root.append(item.element)

        with open(self.filename, 'wb') as file:
            xml_string = ET.tostring(self.root)
            xml_string = xml.dom.minidom.parseString(xml_string).toprettyxml()
            file.write(xml_string)


class TestCase(object):
    results = {
        TestResult.PASSED: None,
        TestResult.SKIPPED: 'skipped',
        TestResult.FAILED: 'failure',
        TestResult.ERRORED: 'error',
    }

    def __init__(self, classname, name, time=None):
        self.element = ET.Element('testcase')
        self.element.set('classname', str(classname))
        self.element.set('name', str(name))
        if time is not None:
            self.element.set('time', str(time))
        self.result_type = None

    def set_result(self, result_type, message=None, error_type=None, text=None):
        self.result_type = result_type
        result = TestCase.results[self.result_type]
        if result is not None and result is not TestResult.PASSED:
            subelement = ET.SubElement(self.element, result)
            if error_type is not None:
                subelement.set('type', error_type)
            if message is not None:
                subelement.set('message', str(message))
            if text is not None:
                subelement.text = text
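A ``TestSuite`` assembled this way serializes to JUnit-style XML. A standalone sketch of that output shape (the package names here are just illustrative examples), runnable outside Spack:

```python
import xml.dom.minidom
import xml.etree.ElementTree as ET

# Build the same <testsuite>/<testcase> shape the classes above produce.
root = ET.Element('testsuite')
root.set('tests', '2')
root.set('failures', '1')
root.set('errors', '0')

passed = ET.SubElement(root, 'testcase')
passed.set('classname', 'libdwarf')
passed.set('name', 'libdwarf-20130729')

failed = ET.SubElement(root, 'testcase')
failed.set('classname', 'mpileaks')
failed.set('name', 'mpileaks-1.0')
failure = ET.SubElement(failed, 'failure')
failure.set('message', 'Installation failure')

# Pretty-print, as TestSuite.__exit__ does before writing the report.
pretty = xml.dom.minidom.parseString(ET.tostring(root)).toprettyxml()
print(pretty)
```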
def fetch_log(path):
@@ -114,46 +133,76 @@ def fetch_log(path):
def failed_dependencies(spec):
    return set(item for item in spec.dependencies.itervalues()
               if not spack.repo.get(item).installed)


def get_top_spec_or_die(args):
    specs = spack.cmd.parse_specs(args.package, concretize=True)
    if len(specs) > 1:
        tty.die("Only 1 top-level package can be specified")
    top_spec = iter(specs).next()
    return top_spec


def install_single_spec(spec, number_of_jobs):
    package = spack.repo.get(spec)

    # If it is already installed, skip the test
    if spack.repo.get(spec).installed:
        testcase = TestCase(package.name, package.spec.short_spec, time=0.0)
        testcase.set_result(TestResult.SKIPPED,
                            message='Skipped [already installed]',
                            error_type='already_installed')
        return testcase

    # If it relies on dependencies that did not install, skip
    if failed_dependencies(spec):
        testcase = TestCase(package.name, package.spec.short_spec, time=0.0)
        testcase.set_result(TestResult.SKIPPED,
                            message='Skipped [failed dependencies]',
                            error_type='dep_failed')
        return testcase

    # Otherwise try to install the spec
    try:
        start_time = time.time()
        package.do_install(keep_prefix=False,
                           keep_stage=True,
                           ignore_deps=False,
                           make_jobs=number_of_jobs,
                           verbose=True,
                           fake=False)
        duration = time.time() - start_time
        testcase = TestCase(package.name, package.spec.short_spec, duration)
        testcase.set_result(TestResult.PASSED)
    except InstallError:
        # An InstallError is considered a failure (the recipe didn't work correctly)
        duration = time.time() - start_time
        # Try to get the log
        lines = fetch_log(package.build_log_path)
        text = '\n'.join(lines)
        testcase = TestCase(package.name, package.spec.short_spec, duration)
        testcase.set_result(TestResult.FAILED,
                            message='Installation failure',
                            text=text)
    except FetchError:
        # A FetchError is considered an error (we didn't even start building)
        duration = time.time() - start_time
        testcase = TestCase(package.name, package.spec.short_spec, duration)
        testcase.set_result(TestResult.ERRORED,
                            message='Unable to fetch package')

    return testcase


def get_filename(args, top_spec):
    if not args.output:
        fname = 'test-{x.name}-{x.version}-{hash}.xml'.format(x=top_spec, hash=top_spec.dag_hash())
        output_directory = join_path(os.getcwd(), 'test-output')
        if not os.path.exists(output_directory):
            os.mkdir(output_directory)
        output_filename = join_path(output_directory, fname)
    else:
        output_filename = args.output
    return output_filename
def test_install(parser, args):
    # Check the input
    if not args.package:
        tty.die("install requires a package argument")
@@ -162,50 +211,15 @@ def test_install(parser, args):
        tty.die("The -j option must be a positive integer!")

    if args.no_checksum:
        spack.do_checksum = False  # TODO: remove this global.

    # Get the one and only top spec
    top_spec = get_top_spec_or_die(args)
    # Get the filename of the test
    output_filename = get_filename(args, top_spec)
    # TEST SUITE
    with TestSuite(output_filename) as test_suite:
        # Traverse in post order : each spec is a test case
        for spec in top_spec.traverse(order='post'):
            test_case = install_single_spec(spec, args.jobs)
            test_suite.append(test_case)
@@ -97,6 +97,9 @@ class Compiler(object):
    # argument used to get C++11 options
    cxx11_flag = "-std=c++11"

    # argument used to get C++14 options
    cxx14_flag = "-std=c++1y"

    def __init__(self, cspec, cc, cxx, f77, fc):
        def check(exe):
......
@@ -54,9 +54,16 @@ def cxx11_flag(self):
        if self.version < ver('4.3'):
            tty.die("Only gcc 4.3 and above support c++11.")
        elif self.version < ver('4.7'):
            return "-std=c++0x"
        else:
            return "-std=c++11"

    @property
    def cxx14_flag(self):
        if self.version < ver('4.8'):
            tty.die("Only gcc 4.8 and above support c++14.")
        else:
            return "-std=c++14"

    @classmethod
    def fc_version(cls, fc):
......
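The version-to-flag logic above can be read as a small pure function. A simplified standalone sketch (versions as plain tuples, ``RuntimeError`` standing in for ``tty.die``), for illustration only:

```python
def gcc_cxx11_flag(version):
    """Mirrors the gcc cxx11_flag property, with versions as tuples."""
    if version < (4, 3):
        raise RuntimeError("Only gcc 4.3 and above support c++11.")
    elif version < (4, 7):
        return "-std=c++0x"  # pre-4.7 spelling of the flag
    else:
        return "-std=c++11"

def gcc_cxx14_flag(version):
    """Mirrors the gcc cxx14_flag property."""
    if version < (4, 8):
        raise RuntimeError("Only gcc 4.8 and above support c++14.")
    return "-std=c++14"

print(gcc_cxx11_flag((4, 4)))  # -std=c++0x
print(gcc_cxx14_flag((5, 1)))  # -std=c++14
```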
@@ -157,12 +157,26 @@ def fetch(self):
            tty.msg("Already downloaded %s" % self.archive_file)
            return

        possible_files = self.stage.expected_archive_files
        save_file = None
        partial_file = None
        if possible_files:
            save_file = self.stage.expected_archive_files[0]
            partial_file = self.stage.expected_archive_files[0] + '.part'

        tty.msg("Trying to fetch from %s" % self.url)

        if partial_file:
            save_args = ['-C', '-',            # continue partial downloads
                         '-o', partial_file]   # use a .part file
        else:
            save_args = ['-O']

        curl_args = save_args + [
            '-f',       # fail on >400 errors
            '-D', '-',  # print out HTML headers
            '-L',       # resolve 3xx redirects
            self.url, ]

        if sys.stdout.isatty():
            curl_args.append('-#')  # status bar when using a tty
@@ -178,6 +192,9 @@ def fetch(self):
            if self.archive_file:
                os.remove(self.archive_file)
            if partial_file and os.path.exists(partial_file):
                os.remove(partial_file)

            if spack.curl.returncode == 22:
                # This is a 404. Curl will print the error.
                raise FailedDownloadError(
@@ -209,6 +226,9 @@ def fetch(self):
                "'spack clean <package>' to remove the bad archive, then fix",
                "your internet gateway issue and install again.")

        if save_file:
            os.rename(partial_file, save_file)

        if not self.archive_file:
            raise FailedDownloadError(self.url)
......
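The partial-file handling added above follows a common resumable-download pattern: stream into a ``.part`` file, resume with curl's ``-C -``, and rename into place only after curl succeeds, so a half-written file is never mistaken for a valid archive. A small sketch of just the argument-selection step (the function name is hypothetical):

```python
def curl_save_args(partial_file):
    """Choose curl's save arguments the way the fetcher above does."""
    if partial_file:
        return ['-C', '-',           # continue a previous partial download
                '-o', partial_file]  # write into the .part file
    return ['-O']  # no expected name known: let curl use the remote name

print(curl_save_args('openmpi-1.10.tar.gz.part'))
print(curl_save_args(None))
```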
@@ -210,6 +210,18 @@ def _need_to_create_path(self):
        return False

    @property
    def expected_archive_files(self):
        """Possible archive file paths."""
        paths = []
        if isinstance(self.fetcher, fs.URLFetchStrategy):
            paths.append(os.path.join(self.path, os.path.basename(self.fetcher.url)))
        if self.mirror_path:
            paths.append(os.path.join(self.path, os.path.basename(self.mirror_path)))
        return paths

    @property
    def archive_file(self):
        """Path to the source archive within this stage directory."""
......
@@ -61,14 +61,14 @@
    'optional_deps',
    'make_executable',
    'configure_guess',
    'lock',
    'database',
    'namespace_trie',
    'yaml',
    'sbang',
    'environment',
    'cmd.uninstall',
    'cmd.test_install']
def list_tests():
......
@@ -219,3 +219,27 @@ def test_ld_deps(self):
                      ' '.join(test_command))

    def test_ld_deps_reentrant(self):
        """Make sure ld -r is handled correctly on OS's where it doesn't
           support rpaths."""
        os.environ['SPACK_DEPENDENCIES'] = ':'.join([self.dep1])

        os.environ['SPACK_SHORT_SPEC'] = "foo@1.2=linux-x86_64"
        reentrant_test_command = ['-r'] + test_command
        self.check_ld('dump-args', reentrant_test_command,
                      'ld ' +
                      '-rpath ' + self.prefix + '/lib ' +
                      '-rpath ' + self.prefix + '/lib64 ' +
                      '-L' + self.dep1 + '/lib ' +
                      '-rpath ' + self.dep1 + '/lib ' +
                      '-r ' +
                      ' '.join(test_command))

        os.environ['SPACK_SHORT_SPEC'] = "foo@1.2=darwin-x86_64"
        self.check_ld('dump-args', reentrant_test_command,
                      'ld ' +
                      '-L' + self.dep1 + '/lib ' +
                      '-r ' +
                      ' '.join(test_command))
@@ -22,23 +22,39 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import collections
import itertools
import os
import unittest
from contextlib import contextmanager

import StringIO

import spack
import spack.cmd

FILE_REGISTRY = collections.defaultdict(StringIO.StringIO)


# Monkey-patch open to write module files to a StringIO instance
@contextmanager
def mock_open(filename, mode):
    if not mode == 'wb':
        raise RuntimeError('test.test_install : unexpected opening mode for monkey-patched open')

    FILE_REGISTRY[filename] = StringIO.StringIO()

    try:
        yield FILE_REGISTRY[filename]
    finally:
        handle = FILE_REGISTRY[filename]
        FILE_REGISTRY[filename] = handle.getvalue()
        handle.close()

# The use of __import__ is necessary to maintain a name with hyphen
# (which cannot be an identifier in python)
test_install = __import__("spack.cmd.test-install", fromlist=['test_install'])


class MockSpec(object):
    def __init__(self, name, version, hashStr=None):
@@ -48,79 +64,127 @@ def __init__(self, name, version, hashStr=None):
        self.hash = hashStr if hashStr else hash((name, version))

    def traverse(self, order=None):
        for _, spec in self.dependencies.items():
            yield spec
        yield self

    def dag_hash(self):
        return self.hash

    @property
    def short_spec(self):
        return '-'.join([self.name, str(self.version), str(self.hash)])


class MockPackage(object):
    def __init__(self, spec, buildLogPath):
        self.name = spec.name
        self.spec = spec
        self.installed = False
        self.build_log_path = buildLogPath

    def do_install(self, *args, **kwargs):
        self.installed = True


class MockPackageDb(object):
    def __init__(self, init=None):
        self.specToPkg = {}
        if init:
            self.specToPkg.update(init)

    def get(self, spec):
        return self.specToPkg[spec]


def mock_fetch_log(path):
    return []


specX = MockSpec('X', "1.2.0")
specY = MockSpec('Y', "2.3.8")
specX.dependencies['Y'] = specY
pkgX = MockPackage(specX, 'logX')
pkgY = MockPackage(specY, 'logY')


class MockArgs(object):
    def __init__(self, package):
        self.package = package
        self.jobs = None
        self.no_checksum = False
        self.output = None


# TODO: add test(s) where Y fails to install
class TestInstallTest(unittest.TestCase):
    """
    Tests test-install where X->Y
    """

    def setUp(self):
        super(TestInstallTest, self).setUp()

        # Monkey patch parse specs
        def monkey_parse_specs(x, concretize):
            if x == 'X':
                return [specX]
            elif x == 'Y':
                return [specY]
            return []

        self.parse_specs = spack.cmd.parse_specs
        spack.cmd.parse_specs = monkey_parse_specs

        # Monkey patch os.mkdir
        self.os_mkdir = os.mkdir
        os.mkdir = lambda x: True

        # Monkey patch open
        test_install.open = mock_open

        # Clean FILE_REGISTRY
        FILE_REGISTRY.clear()

        pkgX.installed = False
        pkgY.installed = False

        # Monkey patch pkgDb
        self.saved_db = spack.repo
        pkgDb = MockPackageDb({specX: pkgX, specY: pkgY})
        spack.repo = pkgDb

    def tearDown(self):
        # Remove the monkey patched test_install.open
        test_install.open = open

        # Remove the monkey patched os.mkdir
        os.mkdir = self.os_mkdir
        del self.os_mkdir

        # Remove the monkey patched parse_specs
        spack.cmd.parse_specs = self.parse_specs
        del self.parse_specs

        super(TestInstallTest, self).tearDown()
        spack.repo = self.saved_db

    def test_installing_both(self):
        test_install.test_install(None, MockArgs('X'))
        self.assertEqual(len(FILE_REGISTRY), 1)
        for _, content in FILE_REGISTRY.items():
            self.assertTrue('tests="2"' in content)
            self.assertTrue('failures="0"' in content)
            self.assertTrue('errors="0"' in content)

    def test_dependency_already_installed(self):
        pkgX.installed = True
        pkgY.installed = True
        test_install.test_install(None, MockArgs('X'))
        self.assertEqual(len(FILE_REGISTRY), 1)
        for _, content in FILE_REGISTRY.items():
            self.assertTrue('tests="2"' in content)
            self.assertTrue('failures="0"' in content)
            self.assertTrue('errors="0"' in content)
            self.assertEqual(sum('skipped' in line for line in content.split('\n')), 2)
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
@@ -84,7 +84,10 @@ function spack {
            if [ "$_sp_arg" = "-h" ]; then
                command spack cd -h
            else
                LOC="$(spack location $_sp_arg "$@")"
                if [[ -d "$LOC" ]] ; then
                    cd "$LOC"
                fi
            fi
            return
            ;;
......
import os
from spack import *


class Luajit(Package):
    """Fast, flexible, JIT-compiled Lua."""
    homepage = "http://www.luajit.org"
    url = "http://luajit.org/download/LuaJIT-2.0.4.tar.gz"

    version('2.0.4', 'dd9c38307f2223a504cbfb96e477eca0')

    def install(self, spec, prefix):
        # Linking with the C++ compiler is a dirty hack to deal with the fact
        # that unwinding symbols are not included by libc; this is necessary
        # on some platforms for the final link stage to work.
        make("install", "PREFIX=" + prefix, "TARGET_LD=" + os.environ['CXX'])
diff --git a/ADOL-C/examples/additional_examples/openmp_exam/liborpar.cpp b/ADOL-C/examples/additional_examples/openmp_exam/liborpar.cpp
index fc6fc28..14103d2 100644
--- a/ADOL-C/examples/additional_examples/openmp_exam/liborpar.cpp
+++ b/ADOL-C/examples/additional_examples/openmp_exam/liborpar.cpp
@@ -27,7 +27,7 @@ using namespace std;
#include <ctime>
#include <cmath>
-#include "adolc.h"
+#include <adolc/adolc.h>
#ifdef _OPENMP
#include <omp.h>
from spack import *
import sys


class AdolC(Package):
    """A package for the automatic differentiation of first and higher
    derivatives of vector functions in C and C++ programs by operator
    overloading."""
    homepage = "https://projects.coin-or.org/ADOL-C"
    url = "http://www.coin-or.org/download/source/ADOL-C/ADOL-C-2.6.1.tgz"

    version('head', svn='https://projects.coin-or.org/svn/ADOL-C/trunk/')
    version('2.6.1', '1032b28427d6e399af4610e78c0f087b')

    variant('doc', default=True, description='Install documentation')
    variant('openmp', default=False, description='Enable OpenMP support')
    variant('sparse', default=False, description='Enable sparse drivers')
    variant('tests', default=True,
            description='Build all included examples as a test case')

    patch('openmp_exam.patch')
    def install(self, spec, prefix):
        configure_args = ['--prefix=%s' % prefix]

        # --with-cflags=FLAGS    use CFLAGS=FLAGS   (default: -O3 -Wall -ansi)
        # --with-cxxflags=FLAGS  use CXXFLAGS=FLAGS (default: -O3 -Wall)

        if '+openmp' in spec:
            if spec.satisfies('%gcc'):
                # FIXME: Is this required?
                # -I <path to omp.h> -L <LLVM OpenMP library path>
                configure_args.append('--with-openmp-flag=-fopenmp')
            else:
                raise InstallError(
                    'OpenMP flags for compilers other than GCC '
                    'are not implemented.')

        if '+sparse' in spec:
            configure_args.append('--enable-sparse')

        # We can simply use the bundled examples to check
        # whether ADOL-C works as expected.
        if '+tests' in spec:
            configure_args.extend([
                '--enable-docexa',  # Documented examples
                '--enable-addexa'   # Additional examples
            ])
            if '+openmp' in spec:
                configure_args.append('--enable-parexa')  # Parallel examples

        configure(*configure_args)
        make()
        make("install")
        # Copy the config.h file, as some packages might require it
        source_directory = self.stage.source_path
        config_h = join_path(source_directory, 'ADOL-C', 'src', 'config.h')
        install(config_h, join_path(prefix.include, 'adolc'))

        # Install documentation to {prefix}/share
        if '+doc' in spec:
            install_tree(join_path('ADOL-C', 'doc'),
                         join_path(prefix.share, 'doc'))

        # Install examples to {prefix}/share
        if '+tests' in spec:
            install_tree(join_path('ADOL-C', 'examples'),
                         join_path(prefix.share, 'examples'))

            # Run some examples that don't require user input
            # TODO: Check that bundled examples produce the correct results
            with working_dir(join_path(source_directory,
                                       'ADOL-C', 'examples')):
                Executable('./tapeless_scalar')()
                Executable('./tapeless_vector')()

            with working_dir(join_path(source_directory, 'ADOL-C',
                                       'examples', 'additional_examples')):
                Executable('./checkpointing/checkpointing')()

            if '+openmp' in spec:
                # Re-run the checkpointing example against the OpenMP build
                with working_dir(join_path(source_directory, 'ADOL-C',
                                           'examples',
                                           'additional_examples')):
                    Executable('./checkpointing/checkpointing')()
@@ -14,4 +14,5 @@ def install(self, spec, prefix):
         make('-f',
              join_path(self.stage.source_path, 'build', 'clang', 'Makefile'),
              parallel=False)
+        mkdirp(self.prefix.bin)
         install(join_path(self.stage.source_path, 'src', 'bin', 'astyle'),
                 self.prefix.bin)
@@ -8,6 +8,8 @@ class Autoconf(Package):
     version('2.69', '82d05e03b93e45f5a39b828dc9c6c29b')
     version('2.62', '6c1f3b3734999035d77da5024aab4fbd')

+    depends_on("m4")
+
     def install(self, spec, prefix):
         configure("--prefix=%s" % prefix)
...
from spack import *


class Bbcp(Package):
    """Securely and quickly copy data from source to target."""
    homepage = "http://www.slac.stanford.edu/~abh/bbcp/"

    version('git', git='http://www.slac.stanford.edu/~abh/bbcp/bbcp.git',
            branch="master")

    def install(self, spec, prefix):
        cd("src")
        make()
        # bbcp builds the executable in a directory whose name depends on
        # the system type, as reported by the bundled MakeSname script.
        makesname = Executable("../MakeSname")
        bbcp_executable_path = \
            "../bin/%s/bbcp" % makesname(output=str).rstrip("\n")
        destination_path = "%s/bin/" % prefix
        mkdirp(destination_path)
        install(bbcp_executable_path, destination_path)
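The package above has to ask `MakeSname` where the binary landed, because bbcp's build drops it under `bin/<system-name>/`. A rough sketch of that per-platform layout, approximated with `uname -s` (bbcp's actual `MakeSname` naming scheme differs on some platforms):

```shell
#!/bin/sh
# Approximate bbcp's platform-dependent binary path, e.g. bin/Linux/bbcp.
# 'uname -s' stands in for the MakeSname script; this is an approximation.
sysname="$(uname -s)"
bbcp_bin="bin/${sysname}/bbcp"
echo "$bbcp_bin"
```

This is why the package captures the script's output at install time instead of hard-coding a path.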