Merge branch 'best-effort-supplementary-groups' into overlayfs-store
Commit 04d5aa02e6
99 changed files with 1416 additions and 213 deletions
.gitignore (vendored, 1 addition)

@@ -95,6 +95,7 @@ perl/Makefile.config
 # /tests/lang/
 /tests/lang/*.out
 /tests/lang/*.out.xml
+/tests/lang/*.err
 /tests/lang/*.ast
 
 /perl/lib/Nix/Config.pm
@@ -86,6 +86,31 @@ GNU gdb (GDB) 12.1
 One can debug the Nix invocation in all the usual ways.
 For example, enter `run` to start the Nix invocation.
 
+### Characterization testing
+
+Occasionally, Nix utilizes a technique called [Characterization Testing](https://en.wikipedia.org/wiki/Characterization_test) as part of the functional tests.
+This technique is to include the exact output/behavior of a former version of Nix in a test in order to check that Nix continues to produce the same behavior going forward.
+
+For example, this technique is used for the language tests, to check both the printed final value if evaluation was successful, and any errors and warnings encountered.
+
+It is frequently useful to regenerate the expected output.
+To do that, rerun the failed test with `_NIX_TEST_ACCEPT=1`.
+(At least, this is the convention we've used for `tests/lang.sh`.
+If we add more characterization testing we should always strive to be consistent.)
+
+An interesting situation to document is the case when these tests are "overfitted".
+The language tests are, again, an example of this.
+The expected successful output of evaluation is supposed to be highly stable – we do not intend to make breaking changes to (the stable parts of) the Nix language.
+However, the errors and warnings during evaluation (successful or not) are not stable in this way.
+We are free to change how they are displayed at any time.
+
+It may be surprising that we would test non-normative behavior like diagnostic outputs.
+Diagnostic outputs are indeed not a stable interface, but they still are important to users.
+By recording the expected output, the test suite guards against accidental changes and ensures that the *result* (not just the code that implements it) of the diagnostic code paths is under code review.
+Regressions are caught, and improvements always show up in code review.
+
+To ensure that characterization testing doesn't make it harder to intentionally change these interfaces, there always must be an easy way to regenerate the expected output, as we do with `_NIX_TEST_ACCEPT=1`.
+
 ## Integration tests
 
 The integration tests are defined in the Nix flake under the `hydraJobs.tests` attribute.
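To make the golden-file pattern described above concrete, here is a minimal GoogleTest-style sketch (not code from this commit); the helpers `readGolden`/`writeGolden`, the golden file path, and the way `_NIX_TEST_ACCEPT` is read are hypothetical and only illustrate the shape of a characterization test.

```cpp
#include <gtest/gtest.h>
#include <cstdlib>
#include <fstream>
#include <sstream>
#include <string>

// Hypothetical helpers: read or overwrite the recorded ("golden") output of a test case.
static std::string readGolden(const std::string & path) {
    std::ifstream f(path);
    std::ostringstream ss;
    ss << f.rdbuf();
    return ss.str();
}

static void writeGolden(const std::string & path, const std::string & contents) {
    std::ofstream(path) << contents;
}

TEST(Characterization, errorMessageIsStable) {
    // Imagine this string is produced by the code under test (e.g. an evaluator diagnostic).
    std::string got = "error: attribute 'foo' missing\n";
    std::string goldenFile = "golden/error-foo-missing.txt";

    if (std::getenv("_NIX_TEST_ACCEPT")) {
        // Regenerate the expected output instead of failing, mirroring _NIX_TEST_ACCEPT=1.
        writeGolden(goldenFile, got);
        GTEST_SKIP() << "accepted new golden output";
    }

    // Otherwise, the current behaviour must match the recorded behaviour exactly.
    ASSERT_EQ(got, readGolden(goldenFile));
}
```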
@@ -3,7 +3,7 @@
 This section lists the functions built into the Nix language evaluator.
 All built-in functions are available through the global [`builtins`](./builtin-constants.md#builtins-builtins) constant.
 
-For convenience, some built-ins are can be accessed directly:
+For convenience, some built-ins can be accessed directly:
 
 - [`derivation`](#builtins-derivation)
 - [`import`](#builtins-import)
@@ -2,5 +2,7 @@
 
 - [`nix-channel`](../command-ref/nix-channel.md) now supports a `--list-generations` subcommand
 
+* The function [`builtins.fetchClosure`](../language/builtins.md#builtins-fetchClosure) can now fetch input-addressed paths in [pure evaluation mode](../command-ref/conf-file.md#conf-pure-eval), as those are not impure.
+
 - Nix now allows unprivileged/[`allowed-users`](../command-ref/conf-file.md#conf-allowed-users) to sign paths.
   Previously, only [`trusted-users`](../command-ref/conf-file.md#conf-trusted-users) users could sign paths.
@@ -10,6 +10,7 @@ ConditionPathIsReadWrite=@localstatedir@/nix/daemon-socket
 ExecStart=@@bindir@/nix-daemon nix-daemon --daemon
 KillMode=process
 LimitNOFILE=1048576
+TasksMax=1048576
 
 [Install]
 WantedBy=multi-user.target
@@ -105,7 +105,9 @@ MixEvalArgs::MixEvalArgs()
         )",
         .category = category,
         .labels = {"path"},
-        .handler = {[&](std::string s) { searchPath.push_back(s); }}
+        .handler = {[&](std::string s) {
+            searchPath.elements.emplace_back(SearchPath::Elem::parse(s));
+        }}
     });
 
     addFlag({
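For context on what the new `-I` handler produces, here is a small illustrative sketch (not part of the diff) of how an argument such as `nixpkgs=/path/to/nixpkgs` is split by `SearchPath::Elem::parse`; the expected splits follow the parser and unit tests added later in this commit (`src/libexpr/search-path.cc` and its tests).

```cpp
#include <cassert>
#include <string>
#include "search-path.hh" // header added by this commit

int main()
{
    using nix::SearchPath;

    // "-I nixpkgs=/path/to/nixpkgs" → prefix "nixpkgs", path "/path/to/nixpkgs"
    auto elem = SearchPath::Elem::parse("nixpkgs=/path/to/nixpkgs");
    assert(elem.prefix.s == "nixpkgs");
    assert(elem.path.s == "/path/to/nixpkgs");

    // "-I /some/dir" → empty prefix, the whole string is the path
    auto bare = SearchPath::Elem::parse("/some/dir");
    assert(bare.prefix.s == "");
    assert(bare.path.s == "/some/dir");

    return 0;
}
```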
@@ -3,6 +3,7 @@
 
 #include "args.hh"
 #include "common-args.hh"
+#include "search-path.hh"
 
 namespace nix {
 
@@ -19,7 +20,7 @@ struct MixEvalArgs : virtual Args, virtual MixRepair
 
     Bindings * getAutoArgs(EvalState & state);
 
-    Strings searchPath;
+    SearchPath searchPath;
 
     std::optional<std::string> evalStoreUrl;
 
@@ -68,7 +68,7 @@ struct NixRepl
 
     const Path historyFile;
 
-    NixRepl(const Strings & searchPath, nix::ref<Store> store,ref<EvalState> state,
+    NixRepl(const SearchPath & searchPath, nix::ref<Store> store,ref<EvalState> state,
          std::function<AnnotatedValues()> getValues);
     virtual ~NixRepl();
 
@@ -104,7 +104,7 @@ std::string removeWhitespace(std::string s)
 }
 
 
-NixRepl::NixRepl(const Strings & searchPath, nix::ref<Store> store, ref<EvalState> state,
+NixRepl::NixRepl(const SearchPath & searchPath, nix::ref<Store> store, ref<EvalState> state,
     std::function<NixRepl::AnnotatedValues()> getValues)
     : AbstractNixRepl(state)
     , debugTraceIndex(0)
@@ -1024,7 +1024,7 @@ std::ostream & NixRepl::printValue(std::ostream & str, Value & v, unsigned int m
 
 
 std::unique_ptr<AbstractNixRepl> AbstractNixRepl::create(
-    const Strings & searchPath, nix::ref<Store> store, ref<EvalState> state,
+    const SearchPath & searchPath, nix::ref<Store> store, ref<EvalState> state,
     std::function<AnnotatedValues()> getValues)
 {
     return std::make_unique<NixRepl>(
@@ -1044,7 +1044,7 @@ void AbstractNixRepl::runSimple(
         NixRepl::AnnotatedValues values;
         return values;
     };
-    const Strings & searchPath = {};
+    SearchPath searchPath = {};
    auto repl = std::make_unique<NixRepl>(
        searchPath,
        openStore(),
@@ -25,7 +25,7 @@ struct AbstractNixRepl
    typedef std::vector<std::pair<Value*,std::string>> AnnotatedValues;
 
    static std::unique_ptr<AbstractNixRepl> create(
-        const Strings & searchPath, nix::ref<Store> store, ref<EvalState> state,
+        const SearchPath & searchPath, nix::ref<Store> store, ref<EvalState> state,
        std::function<AnnotatedValues()> getValues);
 
    static void runSimple(
 
@@ -498,7 +498,7 @@ ErrorBuilder & ErrorBuilder::withFrame(const Env & env, const Expr & expr)
 
 
 EvalState::EvalState(
-    const Strings & _searchPath,
+    const SearchPath & _searchPath,
     ref<Store> store,
     std::shared_ptr<Store> buildStore)
     : sWith(symbols.create("<with>"))
@@ -563,30 +563,32 @@ EvalState::EvalState(
 
     /* Initialise the Nix expression search path. */
     if (!evalSettings.pureEval) {
-        for (auto & i : _searchPath) addToSearchPath(i);
-        for (auto & i : evalSettings.nixPath.get()) addToSearchPath(i);
+        for (auto & i : _searchPath.elements)
+            addToSearchPath(SearchPath::Elem {i});
+        for (auto & i : evalSettings.nixPath.get())
+            addToSearchPath(SearchPath::Elem::parse(i));
     }
 
     if (evalSettings.restrictEval || evalSettings.pureEval) {
         allowedPaths = PathSet();
 
-        for (auto & i : searchPath) {
-            auto r = resolveSearchPathElem(i);
-            if (!r.first) continue;
+        for (auto & i : searchPath.elements) {
+            auto r = resolveSearchPathPath(i.path);
+            if (!r) continue;
 
-            auto path = r.second;
+            auto path = *std::move(r);
 
-            if (store->isInStore(r.second)) {
+            if (store->isInStore(path)) {
                 try {
                     StorePathSet closure;
-                    store->computeFSClosure(store->toStorePath(r.second).first, closure);
+                    store->computeFSClosure(store->toStorePath(path).first, closure);
                     for (auto & path : closure)
                         allowPath(path);
                 } catch (InvalidPath &) {
-                    allowPath(r.second);
+                    allowPath(path);
                 }
             } else
-                allowPath(r.second);
+                allowPath(path);
         }
     }
 
@@ -9,6 +9,7 @@
 #include "config.hh"
 #include "experimental-features.hh"
 #include "input-accessor.hh"
+#include "search-path.hh"
 
 #include <map>
 #include <optional>
@@ -122,15 +123,6 @@ std::string printValue(const EvalState & state, const Value & v);
 std::ostream & operator << (std::ostream & os, const ValueType t);
 
 
-struct SearchPathElem
-{
-    std::string prefix;
-    // FIXME: maybe change this to an std::variant<SourcePath, URL>.
-    std::string path;
-};
-typedef std::list<SearchPathElem> SearchPath;
-
-
 /**
  * Initialise the Boehm GC, if applicable.
  */
@@ -317,7 +309,7 @@ private:
 
     SearchPath searchPath;
 
-    std::map<std::string, std::pair<bool, std::string>> searchPathResolved;
+    std::map<std::string, std::optional<std::string>> searchPathResolved;
 
     /**
      * Cache used by checkSourcePath().
@@ -344,12 +336,12 @@ private:
 public:
 
     EvalState(
-        const Strings & _searchPath,
+        const SearchPath & _searchPath,
         ref<Store> store,
         std::shared_ptr<Store> buildStore = nullptr);
     ~EvalState();
 
-    void addToSearchPath(const std::string & s);
+    void addToSearchPath(SearchPath::Elem && elem);
 
     SearchPath getSearchPath() { return searchPath; }
 
@@ -431,12 +423,16 @@ public:
      * Look up a file in the search path.
      */
     SourcePath findFile(const std::string_view path);
-    SourcePath findFile(SearchPath & searchPath, const std::string_view path, const PosIdx pos = noPos);
+    SourcePath findFile(const SearchPath & searchPath, const std::string_view path, const PosIdx pos = noPos);
 
     /**
+     * Try to resolve a search path value (not the optional key part)
+     *
      * If the specified search path element is a URI, download it.
+     *
+     * If it is not found, return `std::nullopt`
      */
-    std::pair<bool, std::string> resolveSearchPathElem(const SearchPathElem & elem);
+    std::optional<std::string> resolveSearchPathPath(const SearchPath::Path & path);
 
     /**
      * Evaluate an expression to normal form
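The declarations above replace the old `std::pair<bool, std::string>` convention (where "not found" was a `false`/empty-string pair) with `std::optional<std::string>`, both in the resolver's return type and in the `searchPathResolved` cache. As an illustrative, self-contained sketch of that caching pattern (not code from the commit; `tryResolve` is a hypothetical stand-in for the real download/flake/absPath logic):

```cpp
#include <iostream>
#include <map>
#include <optional>
#include <string>

// Hypothetical stand-in for the real resolution logic (tarball download, flake fetch, absPath, ...).
static std::optional<std::string> tryResolve(const std::string & value)
{
    if (value == "missing") return std::nullopt;   // "not found" is nullopt, not an empty-string sentinel
    return "/resolved/" + value;
}

// Memoized resolver, mirroring the shape of the searchPathResolved cache.
static std::optional<std::string> resolveCached(const std::string & value)
{
    static std::map<std::string, std::optional<std::string>> cache;
    if (auto it = cache.find(value); it != cache.end()) return it->second;
    auto res = tryResolve(value);
    cache[value] = res;   // negative results are cached too, so a warning is only emitted once
    return res;
}

int main()
{
    if (auto r = resolveCached("nixpkgs"))
        std::cout << "resolved to " << *r << "\n";
    if (!resolveCached("missing"))
        std::cout << "missing entry is skipped\n";
    return 0;
}
```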
@@ -663,7 +663,7 @@ Expr * EvalState::parse(
     ParseData data {
         .state = *this,
         .symbols = symbols,
-        .basePath = std::move(basePath),
+        .basePath = basePath,
         .origin = {origin},
     };
 
@@ -734,22 +734,9 @@ Expr * EvalState::parseStdin()
 }
 
 
-void EvalState::addToSearchPath(const std::string & s)
+void EvalState::addToSearchPath(SearchPath::Elem && elem)
 {
-    size_t pos = s.find('=');
-    std::string prefix;
-    Path path;
-    if (pos == std::string::npos) {
-        path = s;
-    } else {
-        prefix = std::string(s, 0, pos);
-        path = std::string(s, pos + 1);
-    }
-
-    searchPath.emplace_back(SearchPathElem {
-        .prefix = prefix,
-        .path = path,
-    });
+    searchPath.elements.emplace_back(std::move(elem));
 }
 
 
@@ -759,22 +746,19 @@ SourcePath EvalState::findFile(const std::string_view path)
 }
 
 
-SourcePath EvalState::findFile(SearchPath & searchPath, const std::string_view path, const PosIdx pos)
+SourcePath EvalState::findFile(const SearchPath & searchPath, const std::string_view path, const PosIdx pos)
 {
-    for (auto & i : searchPath) {
-        std::string suffix;
-        if (i.prefix.empty())
-            suffix = concatStrings("/", path);
-        else {
-            auto s = i.prefix.size();
-            if (path.compare(0, s, i.prefix) != 0 ||
-                (path.size() > s && path[s] != '/'))
-                continue;
-            suffix = path.size() == s ? "" : concatStrings("/", path.substr(s));
-        }
-        auto r = resolveSearchPathElem(i);
-        if (!r.first) continue;
-        Path res = r.second + suffix;
+    for (auto & i : searchPath.elements) {
+        auto suffixOpt = i.prefix.suffixIfPotentialMatch(path);
+
+        if (!suffixOpt) continue;
+        auto suffix = *suffixOpt;
+
+        auto rOpt = resolveSearchPathPath(i.path);
+        if (!rOpt) continue;
+        auto r = *rOpt;
+
+        Path res = suffix == "" ? r : concatStrings(r, "/", suffix);
         if (pathExists(res)) return CanonPath(canonPath(res));
     }
 
@@ -791,49 +775,53 @@ SourcePath EvalState::findFile(SearchPath & searchPath, const std::string_view p
 }
 
 
-std::pair<bool, std::string> EvalState::resolveSearchPathElem(const SearchPathElem & elem)
+std::optional<std::string> EvalState::resolveSearchPathPath(const SearchPath::Path & value0)
 {
-    auto i = searchPathResolved.find(elem.path);
+    auto & value = value0.s;
+    auto i = searchPathResolved.find(value);
     if (i != searchPathResolved.end()) return i->second;
 
-    std::pair<bool, std::string> res;
+    std::optional<std::string> res;
 
-    if (EvalSettings::isPseudoUrl(elem.path)) {
+    if (EvalSettings::isPseudoUrl(value)) {
         try {
             auto storePath = fetchers::downloadTarball(
-                store, EvalSettings::resolvePseudoUrl(elem.path), "source", false).tree.storePath;
-            res = { true, store->toRealPath(storePath) };
+                store, EvalSettings::resolvePseudoUrl(value), "source", false).tree.storePath;
+            res = { store->toRealPath(storePath) };
         } catch (FileTransferError & e) {
             logWarning({
-                .msg = hintfmt("Nix search path entry '%1%' cannot be downloaded, ignoring", elem.path)
+                .msg = hintfmt("Nix search path entry '%1%' cannot be downloaded, ignoring", value)
             });
-            res = { false, "" };
+            res = std::nullopt;
         }
     }
 
-    else if (hasPrefix(elem.path, "flake:")) {
+    else if (hasPrefix(value, "flake:")) {
         experimentalFeatureSettings.require(Xp::Flakes);
-        auto flakeRef = parseFlakeRef(elem.path.substr(6), {}, true, false);
-        debug("fetching flake search path element '%s''", elem.path);
+        auto flakeRef = parseFlakeRef(value.substr(6), {}, true, false);
+        debug("fetching flake search path element '%s''", value);
         auto storePath = flakeRef.resolve(store).fetchTree(store).first.storePath;
-        res = { true, store->toRealPath(storePath) };
+        res = { store->toRealPath(storePath) };
     }
 
     else {
-        auto path = absPath(elem.path);
+        auto path = absPath(value);
         if (pathExists(path))
-            res = { true, path };
+            res = { path };
         else {
             logWarning({
-                .msg = hintfmt("Nix search path entry '%1%' does not exist, ignoring", elem.path)
+                .msg = hintfmt("Nix search path entry '%1%' does not exist, ignoring", value)
             });
-            res = { false, "" };
+            res = std::nullopt;
         }
     }
 
-    debug("resolved search path element '%s' to '%s'", elem.path, res.second);
+    if (res)
+        debug("resolved search path element '%s' to '%s'", value, *res);
+    else
+        debug("failed to resolve search path element '%s'", value);
 
-    searchPathResolved[elem.path] = res;
+    searchPathResolved[value] = res;
     return res;
 }
 
@@ -1502,6 +1502,8 @@ static RegisterPrimOp primop_storePath({
       in a new path (e.g. `/nix/store/ld01dnzc…-source-source`).
 
       Not available in [pure evaluation mode](@docroot@/command-ref/conf-file.md#conf-pure-eval).
+
+      See also [`builtins.fetchClosure`](#builtins-fetchClosure).
     )",
     .fun = prim_storePath,
 });
@@ -1656,9 +1658,9 @@ static void prim_findFile(EvalState & state, const PosIdx pos, Value * * args, V
             }));
         }
 
-        searchPath.emplace_back(SearchPathElem {
-            .prefix = prefix,
-            .path = path,
+        searchPath.elements.emplace_back(SearchPath::Elem {
+            .prefix = SearchPath::Prefix { .s = prefix },
+            .path = SearchPath::Path { .s = path },
         });
     }
 
@@ -4319,12 +4321,12 @@ void EvalState::createBaseEnv()
     });
 
     /* Add a value containing the current Nix expression search path. */
-    mkList(v, searchPath.size());
+    mkList(v, searchPath.elements.size());
     int n = 0;
-    for (auto & i : searchPath) {
+    for (auto & i : searchPath.elements) {
         auto attrs = buildBindings(2);
-        attrs.alloc("path").mkString(i.path);
-        attrs.alloc("prefix").mkString(i.prefix);
+        attrs.alloc("path").mkString(i.path.s);
+        attrs.alloc("prefix").mkString(i.prefix.s);
         (v.listElems()[n++] = allocValue())->mkAttrs(attrs);
     }
     addConstant("__nixPath", v, {
@@ -5,37 +5,150 @@
 
 namespace nix {
 
+/**
+ * Handler for the content addressed case.
+ *
+ * @param state Evaluator state and store to write to.
+ * @param fromStore Store containing the path to rewrite.
+ * @param fromPath Source path to be rewritten.
+ * @param toPathMaybe Path to write the rewritten path to. If empty, the error shows the actual path.
+ * @param v Return `Value`
+ */
+static void runFetchClosureWithRewrite(EvalState & state, const PosIdx pos, Store & fromStore, const StorePath & fromPath, const std::optional<StorePath> & toPathMaybe, Value &v) {
+
+    // establish toPath or throw
+
+    if (!toPathMaybe || !state.store->isValidPath(*toPathMaybe)) {
+        auto rewrittenPath = makeContentAddressed(fromStore, *state.store, fromPath);
+        if (toPathMaybe && *toPathMaybe != rewrittenPath)
+            throw Error({
+                .msg = hintfmt("rewriting '%s' to content-addressed form yielded '%s', while '%s' was expected",
+                    state.store->printStorePath(fromPath),
+                    state.store->printStorePath(rewrittenPath),
+                    state.store->printStorePath(*toPathMaybe)),
+                .errPos = state.positions[pos]
+            });
+        if (!toPathMaybe)
+            throw Error({
+                .msg = hintfmt(
+                    "rewriting '%s' to content-addressed form yielded '%s'\n"
+                    "Use this value for the 'toPath' attribute passed to 'fetchClosure'",
+                    state.store->printStorePath(fromPath),
+                    state.store->printStorePath(rewrittenPath)),
+                .errPos = state.positions[pos]
+            });
+    }
+
+    auto toPath = *toPathMaybe;
+
+    // check and return
+
+    auto resultInfo = state.store->queryPathInfo(toPath);
+
+    if (!resultInfo->isContentAddressed(*state.store)) {
+        // We don't perform the rewriting when outPath already exists, as an optimisation.
+        // However, we can quickly detect a mistake if the toPath is input addressed.
+        throw Error({
+            .msg = hintfmt(
+                "The 'toPath' value '%s' is input-addressed, so it can't possibly be the result of rewriting to a content-addressed path.\n\n"
+                "Set 'toPath' to an empty string to make Nix report the correct content-addressed path.",
+                state.store->printStorePath(toPath)),
+            .errPos = state.positions[pos]
+        });
+    }
+
+    state.mkStorePathString(toPath, v);
+}
+
+/**
+ * Fetch the closure and make sure it's content addressed.
+ */
+static void runFetchClosureWithContentAddressedPath(EvalState & state, const PosIdx pos, Store & fromStore, const StorePath & fromPath, Value & v) {
+
+    if (!state.store->isValidPath(fromPath))
+        copyClosure(fromStore, *state.store, RealisedPath::Set { fromPath });
+
+    auto info = state.store->queryPathInfo(fromPath);
+
+    if (!info->isContentAddressed(*state.store)) {
+        throw Error({
+            .msg = hintfmt(
+                "The 'fromPath' value '%s' is input-addressed, but 'inputAddressed' is set to 'false' (default).\n\n"
+                "If you do intend to fetch an input-addressed store path, add\n\n"
+                "    inputAddressed = true;\n\n"
+                "to the 'fetchClosure' arguments.\n\n"
+                "Note that to ensure authenticity input-addressed store paths, users must configure a trusted binary cache public key on their systems. This is not needed for content-addressed paths.",
+                state.store->printStorePath(fromPath)),
+            .errPos = state.positions[pos]
+        });
+    }
+
+    state.mkStorePathString(fromPath, v);
+}
+
+/**
+ * Fetch the closure and make sure it's input addressed.
+ */
+static void runFetchClosureWithInputAddressedPath(EvalState & state, const PosIdx pos, Store & fromStore, const StorePath & fromPath, Value & v) {
+
+    if (!state.store->isValidPath(fromPath))
+        copyClosure(fromStore, *state.store, RealisedPath::Set { fromPath });
+
+    auto info = state.store->queryPathInfo(fromPath);
+
+    if (info->isContentAddressed(*state.store)) {
+        throw Error({
+            .msg = hintfmt(
+                "The store object referred to by 'fromPath' at '%s' is not input-addressed, but 'inputAddressed' is set to 'true'.\n\n"
+                "Remove the 'inputAddressed' attribute (it defaults to 'false') to expect 'fromPath' to be content-addressed",
+                state.store->printStorePath(fromPath)),
+            .errPos = state.positions[pos]
+        });
+    }
+
+    state.mkStorePathString(fromPath, v);
+}
+
+typedef std::optional<StorePath> StorePathOrGap;
+
 static void prim_fetchClosure(EvalState & state, const PosIdx pos, Value * * args, Value & v)
 {
     state.forceAttrs(*args[0], pos, "while evaluating the argument passed to builtins.fetchClosure");
 
     std::optional<std::string> fromStoreUrl;
     std::optional<StorePath> fromPath;
-    bool toCA = false;
-    std::optional<StorePath> toPath;
+    std::optional<StorePathOrGap> toPath;
+    std::optional<bool> inputAddressedMaybe;
 
     for (auto & attr : *args[0]->attrs) {
         const auto & attrName = state.symbols[attr.name];
+        auto attrHint = [&]() -> std::string {
+            return "while evaluating the '" + attrName + "' attribute passed to builtins.fetchClosure";
+        };
 
         if (attrName == "fromPath") {
             NixStringContext context;
-            fromPath = state.coerceToStorePath(attr.pos, *attr.value, context,
-                "while evaluating the 'fromPath' attribute passed to builtins.fetchClosure");
+            fromPath = state.coerceToStorePath(attr.pos, *attr.value, context, attrHint());
         }
 
         else if (attrName == "toPath") {
             state.forceValue(*attr.value, attr.pos);
-            toCA = true;
-            if (attr.value->type() != nString || attr.value->string.s != std::string("")) {
+            bool isEmptyString = attr.value->type() == nString && attr.value->string.s == std::string("");
+            if (isEmptyString) {
+                toPath = StorePathOrGap {};
+            }
+            else {
                 NixStringContext context;
-                toPath = state.coerceToStorePath(attr.pos, *attr.value, context,
-                    "while evaluating the 'toPath' attribute passed to builtins.fetchClosure");
+                toPath = state.coerceToStorePath(attr.pos, *attr.value, context, attrHint());
             }
         }
 
         else if (attrName == "fromStore")
             fromStoreUrl = state.forceStringNoCtx(*attr.value, attr.pos,
-                "while evaluating the 'fromStore' attribute passed to builtins.fetchClosure");
+                attrHint());
+
+        else if (attrName == "inputAddressed")
+            inputAddressedMaybe = state.forceBool(*attr.value, attr.pos, attrHint());
 
         else
             throw Error({
@@ -50,6 +163,18 @@ static void prim_fetchClosure(EvalState & state, const PosIdx pos, Value * * arg
             .errPos = state.positions[pos]
         });
 
+    bool inputAddressed = inputAddressedMaybe.value_or(false);
+
+    if (inputAddressed) {
+        if (toPath)
+            throw Error({
+                .msg = hintfmt("attribute '%s' is set to true, but '%s' is also set. Please remove one of them",
+                    "inputAddressed",
+                    "toPath"),
+                .errPos = state.positions[pos]
+            });
+    }
+
     if (!fromStoreUrl)
         throw Error({
             .msg = hintfmt("attribute '%s' is missing in call to 'fetchClosure'", "fromStore"),
@@ -74,55 +199,40 @@ static void prim_fetchClosure(EvalState & state, const PosIdx pos, Value * * arg
 
     auto fromStore = openStore(parsedURL.to_string());
 
-    if (toCA) {
-        if (!toPath || !state.store->isValidPath(*toPath)) {
-            auto remappings = makeContentAddressed(*fromStore, *state.store, { *fromPath });
-            auto i = remappings.find(*fromPath);
-            assert(i != remappings.end());
-            if (toPath && *toPath != i->second)
-                throw Error({
-                    .msg = hintfmt("rewriting '%s' to content-addressed form yielded '%s', while '%s' was expected",
-                        state.store->printStorePath(*fromPath),
-                        state.store->printStorePath(i->second),
-                        state.store->printStorePath(*toPath)),
-                    .errPos = state.positions[pos]
-                });
-            if (!toPath)
-                throw Error({
-                    .msg = hintfmt(
-                        "rewriting '%s' to content-addressed form yielded '%s'; "
-                        "please set this in the 'toPath' attribute passed to 'fetchClosure'",
-                        state.store->printStorePath(*fromPath),
-                        state.store->printStorePath(i->second)),
-                    .errPos = state.positions[pos]
-                });
-        }
-    } else {
-        if (!state.store->isValidPath(*fromPath))
-            copyClosure(*fromStore, *state.store, RealisedPath::Set { *fromPath });
-        toPath = fromPath;
-    }
-
-    /* In pure mode, require a CA path. */
-    if (evalSettings.pureEval) {
-        auto info = state.store->queryPathInfo(*toPath);
-        if (!info->isContentAddressed(*state.store))
-            throw Error({
-                .msg = hintfmt("in pure mode, 'fetchClosure' requires a content-addressed path, which '%s' isn't",
-                    state.store->printStorePath(*toPath)),
-                .errPos = state.positions[pos]
-            });
-    }
-
-    state.mkStorePathString(*toPath, v);
+    if (toPath)
+        runFetchClosureWithRewrite(state, pos, *fromStore, *fromPath, *toPath, v);
+    else if (inputAddressed)
+        runFetchClosureWithInputAddressedPath(state, pos, *fromStore, *fromPath, v);
+    else
+        runFetchClosureWithContentAddressedPath(state, pos, *fromStore, *fromPath, v);
 }
 
 static RegisterPrimOp primop_fetchClosure({
     .name = "__fetchClosure",
     .args = {"args"},
     .doc = R"(
-      Fetch a Nix store closure from a binary cache, rewriting it into
-      content-addressed form. For example,
+      Fetch a store path [closure](@docroot@/glossary.md#gloss-closure) from a binary cache, and return the store path as a string with context.
+
+      This function can be invoked in three ways, that we will discuss in order of preference.
+
+      **Fetch a content-addressed store path**
+
+      Example:
+
+      ```nix
+      builtins.fetchClosure {
+        fromStore = "https://cache.nixos.org";
+        fromPath = /nix/store/ldbhlwhh39wha58rm61bkiiwm6j7211j-git-2.33.1;
+      }
+      ```
+
+      This is the simplest invocation, and it does not require the user of the expression to configure [`trusted-public-keys`](@docroot@/command-ref/conf-file.md#conf-trusted-public-keys) to ensure their authenticity.
+
+      If your store path is [input addressed](@docroot@/glossary.md#gloss-input-addressed-store-object) instead of content addressed, consider the other two invocations.
+
+      **Fetch any store path and rewrite it to a fully content-addressed store path**
+
+      Example:
+
       ```nix
       builtins.fetchClosure {
@@ -132,28 +242,42 @@ static RegisterPrimOp primop_fetchClosure({
       }
       ```
 
-      fetches `/nix/store/r2jd...` from the specified binary cache,
+      This example fetches `/nix/store/r2jd...` from the specified binary cache,
       and rewrites it into the content-addressed store path
       `/nix/store/ldbh...`.
 
-      If `fromPath` is already content-addressed, or if you are
-      allowing impure evaluation (`--impure`), then `toPath` may be
-      omitted.
+      Like the previous example, no extra configuration or privileges are required.
 
       To find out the correct value for `toPath` given a `fromPath`,
-      you can use `nix store make-content-addressed`:
+      use [`nix store make-content-addressed`](@docroot@/command-ref/new-cli/nix3-store-make-content-addressed.md):
 
       ```console
       # nix store make-content-addressed --from https://cache.nixos.org /nix/store/r2jd6ygnmirm2g803mksqqjm4y39yi6i-git-2.33.1
       rewrote '/nix/store/r2jd6ygnmirm2g803mksqqjm4y39yi6i-git-2.33.1' to '/nix/store/ldbhlwhh39wha58rm61bkiiwm6j7211j-git-2.33.1'
       ```
 
-      This function is similar to `builtins.storePath` in that it
-      allows you to use a previously built store path in a Nix
-      expression. However, it is more reproducible because it requires
-      specifying a binary cache from which the path can be fetched.
-      Also, requiring a content-addressed final store path avoids the
-      need for users to configure binary cache public keys.
+      Alternatively, set `toPath = ""` and find the correct `toPath` in the error message.
+
+      **Fetch an input-addressed store path as is**
+
+      Example:
+
+      ```nix
+      builtins.fetchClosure {
+        fromStore = "https://cache.nixos.org";
+        fromPath = /nix/store/r2jd6ygnmirm2g803mksqqjm4y39yi6i-git-2.33.1;
+        inputAddressed = true;
+      }
+      ```
+
+      It is possible to fetch an [input-addressed store path](@docroot@/glossary.md#gloss-input-addressed-store-object) and return it as is.
+      However, this is the least preferred way of invoking `fetchClosure`, because it requires that the input-addressed paths are trusted by the Nix configuration.
+
+      **`builtins.storePath`**
+
+      `fetchClosure` is similar to [`builtins.storePath`](#builtins-storePath) in that it allows you to use a previously built store path in a Nix expression.
+      However, `fetchClosure` is more reproducible because it specifies a binary cache from which the path can be fetched.
+      Also, using content-addressed store paths does not require users to configure [`trusted-public-keys`](@docroot@/command-ref/conf-file.md#conf-trusted-public-keys) to ensure their authenticity.
     )",
     .fun = prim_fetchClosure,
     .experimentalFeature = Xp::FetchClosure,
src/libexpr/search-path.cc (new file, 56 lines)

@@ -0,0 +1,56 @@
+#include "search-path.hh"
+#include "util.hh"
+
+namespace nix {
+
+std::optional<std::string_view> SearchPath::Prefix::suffixIfPotentialMatch(
+    std::string_view path) const
+{
+    auto n = s.size();
+
+    /* Non-empty prefix and suffix must be separated by a /, or the
+       prefix is not a valid path prefix. */
+    bool needSeparator = n > 0 && (path.size() - n) > 0;
+
+    if (needSeparator && path[n] != '/') {
+        return std::nullopt;
+    }
+
+    /* Prefix must be prefix of this path. */
+    if (path.compare(0, n, s) != 0) {
+        return std::nullopt;
+    }
+
+    /* Skip next path separator. */
+    return {
+        path.substr(needSeparator ? n + 1 : n)
+    };
+}
+
+
+SearchPath::Elem SearchPath::Elem::parse(std::string_view rawElem)
+{
+    size_t pos = rawElem.find('=');
+
+    return SearchPath::Elem {
+        .prefix = Prefix {
+            .s = pos == std::string::npos
+                ? std::string { "" }
+                : std::string { rawElem.substr(0, pos) },
+        },
+        .path = Path {
+            .s = std::string { rawElem.substr(pos + 1) },
+        },
+    };
+}
+
+
+SearchPath parseSearchPath(const Strings & rawElems)
+{
+    SearchPath res;
+    for (auto & rawElem : rawElems)
+        res.elements.emplace_back(SearchPath::Elem::parse(rawElem));
+    return res;
+}
+
+}
src/libexpr/search-path.hh (new file, 108 lines)

@@ -0,0 +1,108 @@
+#pragma once
+///@file
+
+#include <optional>
+
+#include "types.hh"
+#include "comparator.hh"
+
+namespace nix {
+
+/**
+ * A "search path" is a list of ways to look for something, used with
+ * `builtins.findFile` and `< >` lookup expressions.
+ */
+struct SearchPath
+{
+    /**
+     * A single element of a `SearchPath`.
+     *
+     * Each element is tried in succession when looking up a path. The first
+     * element to completely match wins.
+     */
+    struct Elem;
+
+    /**
+     * The first part of a `SearchPath::Elem` pair.
+     *
+     * Called a "prefix" because it takes the form of a prefix of a file
+     * path (first `n` path components). When looking up a path, to use
+     * a `SearchPath::Elem`, its `Prefix` must match the path.
+     */
+    struct Prefix;
+
+    /**
+     * The second part of a `SearchPath::Elem` pair.
+     *
+     * It is either a path or a URL (with certain restrictions / extra
+     * structure).
+     *
+     * If the prefix of the path we are looking up matches, we then
+     * check if the rest of the path points to something that exists
+     * within the directory denoted by this. If so, the
+     * `SearchPath::Elem` as a whole matches, and that *something* being
+     * pointed to by the rest of the path we are looking up is the
+     * result.
+     */
+    struct Path;
+
+    /**
+     * The list of search path elements. Each one is checked for a path
+     * when looking up. (The actual lookup entry point is in `EvalState`
+     * not in this class.)
+     */
+    std::list<SearchPath::Elem> elements;
+
+    /**
+     * Parse a string into a `SearchPath`
+     */
+    static SearchPath parse(const Strings & rawElems);
+};
+
+struct SearchPath::Prefix
+{
+    /**
+     * Underlying string
+     *
+     * @todo Should we normalize this when constructing a `SearchPath::Prefix`?
+     */
+    std::string s;
+
+    GENERATE_CMP(SearchPath::Prefix, me->s);
+
+    /**
+     * If the path possibly matches this search path element, return the
+     * suffix that we should look for inside the resolved value of the
+     * element.
+     * Note the double optionality in the name. While we might have a matching prefix, the suffix may not exist.
+     */
+    std::optional<std::string_view> suffixIfPotentialMatch(std::string_view path) const;
+};
+
+struct SearchPath::Path
+{
+    /**
+     * The location of a search path item, as a path or URL.
+     *
+     * @todo Maybe change this to `std::variant<SourcePath, URL>`.
+     */
+    std::string s;
+
+    GENERATE_CMP(SearchPath::Path, me->s);
+};
+
+struct SearchPath::Elem
+{
+
+    Prefix prefix;
+    Path path;
+
+    GENERATE_CMP(SearchPath::Elem, me->prefix, me->path);
+
+    /**
+     * Parse a string into a `SearchPath::Elem`
+     */
+    static SearchPath::Elem parse(std::string_view rawElem);
+};
+
+}
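To make the lookup contract of the new header concrete, here is a small illustrative sketch (not part of the commit) of how a `<nixpkgs/lib>`-style lookup would be matched against an element's prefix; the expected behaviour follows the unit tests added below in `src/libexpr/tests/search-path.cc`.

```cpp
#include <iostream>
#include "search-path.hh" // header added by this commit

int main()
{
    using nix::SearchPath;

    // An element such as `-I nixpkgs=/path/to/nixpkgs`.
    auto elem = SearchPath::Elem::parse("nixpkgs=/path/to/nixpkgs");

    // Looking up <nixpkgs/lib>: the prefix "nixpkgs" matches and the
    // remaining suffix "lib" should be looked for inside the resolved path.
    if (auto suffix = elem.prefix.suffixIfPotentialMatch("nixpkgs/lib"))
        std::cout << "look for '" << *suffix << "' inside " << elem.path.s << "\n";

    // "nixpkgsX/lib" does not match: the prefix is not followed by a '/'.
    if (!elem.prefix.suffixIfPotentialMatch("nixpkgsX/lib"))
        std::cout << "no match, try the next search path element\n";

    return 0;
}
```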
src/libexpr/tests/search-path.cc (new file, 90 lines)

@@ -0,0 +1,90 @@
+#include <gtest/gtest.h>
+#include <gmock/gmock.h>
+
+#include "search-path.hh"
+
+namespace nix {
+
+TEST(SearchPathElem, parse_justPath) {
+    ASSERT_EQ(
+        SearchPath::Elem::parse("foo"),
+        (SearchPath::Elem {
+            .prefix = SearchPath::Prefix { .s = "" },
+            .path = SearchPath::Path { .s = "foo" },
+        }));
+}
+
+TEST(SearchPathElem, parse_emptyPrefix) {
+    ASSERT_EQ(
+        SearchPath::Elem::parse("=foo"),
+        (SearchPath::Elem {
+            .prefix = SearchPath::Prefix { .s = "" },
+            .path = SearchPath::Path { .s = "foo" },
+        }));
+}
+
+TEST(SearchPathElem, parse_oneEq) {
+    ASSERT_EQ(
+        SearchPath::Elem::parse("foo=bar"),
+        (SearchPath::Elem {
+            .prefix = SearchPath::Prefix { .s = "foo" },
+            .path = SearchPath::Path { .s = "bar" },
+        }));
+}
+
+TEST(SearchPathElem, parse_twoEqs) {
+    ASSERT_EQ(
+        SearchPath::Elem::parse("foo=bar=baz"),
+        (SearchPath::Elem {
+            .prefix = SearchPath::Prefix { .s = "foo" },
+            .path = SearchPath::Path { .s = "bar=baz" },
+        }));
+}
+
+
+TEST(SearchPathElem, suffixIfPotentialMatch_justPath) {
+    SearchPath::Prefix prefix { .s = "" };
+    ASSERT_EQ(prefix.suffixIfPotentialMatch("any/thing"), std::optional { "any/thing" });
+}
+
+TEST(SearchPathElem, suffixIfPotentialMatch_misleadingPrefix1) {
+    SearchPath::Prefix prefix { .s = "foo" };
+    ASSERT_EQ(prefix.suffixIfPotentialMatch("fooX"), std::nullopt);
+}
+
+TEST(SearchPathElem, suffixIfPotentialMatch_misleadingPrefix2) {
+    SearchPath::Prefix prefix { .s = "foo" };
+    ASSERT_EQ(prefix.suffixIfPotentialMatch("fooX/bar"), std::nullopt);
+}
+
+TEST(SearchPathElem, suffixIfPotentialMatch_partialPrefix) {
+    SearchPath::Prefix prefix { .s = "fooX" };
+    ASSERT_EQ(prefix.suffixIfPotentialMatch("foo"), std::nullopt);
+}
+
+TEST(SearchPathElem, suffixIfPotentialMatch_exactPrefix) {
+    SearchPath::Prefix prefix { .s = "foo" };
+    ASSERT_EQ(prefix.suffixIfPotentialMatch("foo"), std::optional { "" });
+}
+
+TEST(SearchPathElem, suffixIfPotentialMatch_multiKey) {
+    SearchPath::Prefix prefix { .s = "foo/bar" };
+    ASSERT_EQ(prefix.suffixIfPotentialMatch("foo/bar/baz"), std::optional { "baz" });
+}
+
+TEST(SearchPathElem, suffixIfPotentialMatch_trailingSlash) {
+    SearchPath::Prefix prefix { .s = "foo" };
+    ASSERT_EQ(prefix.suffixIfPotentialMatch("foo/"), std::optional { "" });
+}
+
+TEST(SearchPathElem, suffixIfPotentialMatch_trailingDoubleSlash) {
+    SearchPath::Prefix prefix { .s = "foo" };
+    ASSERT_EQ(prefix.suffixIfPotentialMatch("foo//"), std::optional { "/" });
+}
+
+TEST(SearchPathElem, suffixIfPotentialMatch_trailingPath) {
+    SearchPath::Prefix prefix { .s = "foo" };
+    ASSERT_EQ(prefix.suffixIfPotentialMatch("foo/bar/baz"), std::optional { "bar/baz" });
+}
+
+}
@@ -909,9 +909,12 @@ void LocalDerivationGoal::startBuilder()
 
         /* Drop additional groups here because we can't do it
            after we've created the new user namespace. */
-        if (settings.dropSupplementaryGroups)
-            if (setgroups(0, 0) == -1)
-                throw SysError("setgroups failed. Set the drop-supplementary-groups option to false to skip this step.");
+        if (setgroups(0, 0) == -1) {
+            if (errno != EPERM)
+                throw SysError("setgroups failed");
+            if (settings.requireDropSupplementaryGroups)
+                throw Error("setgroups failed. Set the require-drop-supplementary-groups option to false to skip this step.");
+        }
 
         ProcessOptions options;
         options.cloneFlags = CLONE_NEWPID | CLONE_NEWNS | CLONE_NEWIPC | CLONE_NEWUTS | CLONE_PARENT | SIGCHLD;
@@ -524,23 +524,22 @@ public:
     Setting<bool> sandboxFallback{this, true, "sandbox-fallback",
         "Whether to disable sandboxing when the kernel doesn't allow it."};
 
-    Setting<bool> dropSupplementaryGroups{this, getuid() == 0, "drop-supplementary-groups",
+    Setting<bool> requireDropSupplementaryGroups{this, getuid() == 0, "require-drop-supplementary-groups",
         R"(
-          Whether to drop supplementary groups when building with sandboxing.
-          This is normally a good idea if we are root and have the capability to
-          do so.
+          Following the principle of least privilege,
+          Nix will attempt to drop supplementary groups when building with sandboxing.
 
-          But if this "root" is mapped from a non-root user in a larger
-          namespace, we won't be able drop additional groups; they will be
-          mapped to nogroup in the child namespace. There does not seem to be a
-          workaround for this.
+          However this can fail under some circumstances.
+          For example, if the user lacks the `CAP_SETGID` capability.
+          Search `setgroups(2)` for `EPERM` to find more detailed information on this.
 
-          (But who can tell from reading user_namespaces(7)? See also https://lwn.net/Articles/621612/.)
+          If you encounter such a failure, setting this option to `false` will let you ignore it and continue.
+          But before doing so, you should consider the security implications carefully.
+          Not dropping supplementary groups means the build sandbox will be less restricted than intended.
 
-          TODO: It might be good to create a middle ground option that allows
-          `setgroups` to fail if all additional groups are "nogroup" / the value
-          of `/proc/sys/fs/overflowuid`. This would handle the common
-          nested-sandboxing case identified above.
+          This option defaults to `true` when the user is root
+          (since `root` usually has permissions to call setgroups)
+          and `false` otherwise.
         )"};
 
 #if __linux__
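For readers unfamiliar with the failure mode this setting describes, here is a minimal standalone sketch (not from the commit) of the best-effort drop the builder performs above: `setgroups` failing with `EPERM` is tolerated unless the require flag is set. The `requireDropSupplementaryGroups` variable stands in for the real setting.

```cpp
#include <cerrno>
#include <cstdio>
#include <grp.h>       // setgroups(2)
#include <stdexcept>

// Illustrative flag standing in for the require-drop-supplementary-groups setting.
static bool requireDropSupplementaryGroups = false;

static void dropSupplementaryGroupsBestEffort()
{
    if (setgroups(0, nullptr) == -1) {
        if (errno != EPERM)
            // Unexpected failure: always fatal.
            throw std::runtime_error("setgroups failed");
        if (requireDropSupplementaryGroups)
            // EPERM (e.g. missing CAP_SETGID, or a mapped "root" inside a user
            // namespace): fatal only if the user insisted on dropping groups.
            throw std::runtime_error("setgroups failed and dropping supplementary groups is required");
        std::fprintf(stderr, "warning: could not drop supplementary groups (EPERM), continuing\n");
    }
}

int main()
{
    dropSupplementaryGroupsBestEffort();
    return 0;
}
```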
@@ -80,4 +80,15 @@ std::map<StorePath, StorePath> makeContentAddressed(
     return remappings;
 }
 
+StorePath makeContentAddressed(
+    Store & srcStore,
+    Store & dstStore,
+    const StorePath & fromPath)
+{
+    auto remappings = makeContentAddressed(srcStore, dstStore, StorePathSet { fromPath });
+    auto i = remappings.find(fromPath);
+    assert(i != remappings.end());
+    return i->second;
+}
+
 }
@@ -5,9 +5,20 @@
 
 namespace nix {
 
+/** Rewrite a closure of store paths to be completely content addressed.
+ */
 std::map<StorePath, StorePath> makeContentAddressed(
     Store & srcStore,
     Store & dstStore,
-    const StorePathSet & storePaths);
+    const StorePathSet & rootPaths);
+
+/** Rewrite a closure of a store path to be completely content addressed.
+ *
+ * This is a convenience function for the case where you only have one root path.
+ */
+StorePath makeContentAddressed(
+    Store & srcStore,
+    Store & dstStore,
+    const StorePath & rootPath);
 
 }
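A hedged usage sketch of the new single-root convenience overload (illustrative only; the helper name, the store handles and the cache URL are placeholders, and the fragment is meant to compile inside the Nix source tree): it rewrites one path's closure from a source store into a destination store and returns the rewritten path, which is what the `fetchClosure` rewrite case above relies on.

```cpp
#include "make-content-addressed.hh"
#include "store-api.hh"

using namespace nix;

// Illustrative helper: rewrite one path's closure from a binary cache into the
// local store and return the resulting content-addressed store path.
StorePath rewriteOne(ref<Store> localStore, const std::string & cacheUrl, const StorePath & fromPath)
{
    auto cache = openStore(cacheUrl);
    // Single-root overload added by this commit: no need to build a StorePathSet
    // and look the result up in the remapping table ourselves.
    return makeContentAddressed(*cache, *localStore, fromPath);
}
```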
@@ -63,7 +63,7 @@ The following types of installable are supported by most commands:
 - [Nix file](#nix-file), optionally qualified by an attribute path
 - [Nix expression](#nix-expression), optionally qualified by an attribute path
 
-For most commands, if no installable is specified, `.` as assumed.
+For most commands, if no installable is specified, `.` is assumed.
 That is, Nix will operate on the default flake output attribute of the flake in the current directory.
 
 ### Flake output attribute
@@ -146,7 +146,7 @@ struct CmdUpgradeNix : MixDryRun, StoreCommand
         auto req = FileTransferRequest(storePathsUrl);
         auto res = getFileTransfer()->download(req);
 
-        auto state = std::make_unique<EvalState>(Strings(), store);
+        auto state = std::make_unique<EvalState>(SearchPath{}, store);
         auto v = state->allocValue();
         state->eval(state->parseExprFromString(res.data, state->rootPath(CanonPath("/no-such-path"))), *v);
         Bindings & bindings(*state->allocBindings(0));
@@ -1,6 +1,7 @@
 requireSandboxSupport
 [[ $busybox =~ busybox ]] || skipTest "no busybox"
 
+# Avoid store dir being inside sandbox build-dir
 unset NIX_STORE_DIR
 unset NIX_STATE_DIR
 
@@ -15,6 +15,9 @@ if test -n "$dot"; then
     $dot < $TEST_ROOT/graph
 fi
 
+# Test GraphML graph generation
+nix-store -q --graphml "$drvPath" > $TEST_ROOT/graphml
+
 outPath=$(nix-store -rvv "$drvPath") || fail "build failed"
 
 # Test Graphviz graph generation.
@@ -33,20 +33,43 @@ clearStore
 [ ! -e $nonCaPath ]
 [ -e $caPath ]
 
+clearStore
+
+# The daemon will reject input addressed paths unless configured to trust the
+# cache key or the user. This behavior should be covered by another test, so we
+# skip this part when using the daemon.
 if [[ "$NIX_REMOTE" != "daemon" ]]; then
 
-  # In impure mode, we can use non-CA paths.
-  [[ $(nix eval --raw --no-require-sigs --impure --expr "
+  # If we want to return a non-CA path, we have to be explicit about it.
+  expectStderr 1 nix eval --raw --no-require-sigs --expr "
     builtins.fetchClosure {
       fromStore = \"file://$cacheDir\";
       fromPath = $nonCaPath;
     }
+  " | grepQuiet -E "The .fromPath. value .* is input-addressed, but .inputAddressed. is set to .false."
+
+  # TODO: Should the closure be rejected, despite single user mode?
+  # [ ! -e $nonCaPath ]
+
+  [ ! -e $caPath ]
+
+  # We can use non-CA paths when we ask explicitly.
+  [[ $(nix eval --raw --no-require-sigs --expr "
+    builtins.fetchClosure {
+      fromStore = \"file://$cacheDir\";
+      fromPath = $nonCaPath;
+      inputAddressed = true;
+    }
   ") = $nonCaPath ]]
 
   [ -e $nonCaPath ]
+  [ ! -e $caPath ]
 
 fi
 
+[ ! -e $caPath ]
+
 # 'toPath' set to empty string should fail but print the expected path.
 expectStderr 1 nix eval -v --json --expr "
   builtins.fetchClosure {
@ -59,6 +82,10 @@ expectStderr 1 nix eval -v --json --expr "
|
||||||
# If fromPath is CA, then toPath isn't needed.
|
# If fromPath is CA, then toPath isn't needed.
|
||||||
nix copy --to file://$cacheDir $caPath
|
nix copy --to file://$cacheDir $caPath
|
||||||
|
|
||||||
|
clearStore
|
||||||
|
|
||||||
|
[ ! -e $caPath ]
|
||||||
|
|
||||||
[[ $(nix eval -v --raw --expr "
|
[[ $(nix eval -v --raw --expr "
|
||||||
builtins.fetchClosure {
|
builtins.fetchClosure {
|
||||||
fromStore = \"file://$cacheDir\";
|
fromStore = \"file://$cacheDir\";
|
||||||
|
@ -66,6 +93,8 @@ nix copy --to file://$cacheDir $caPath
|
||||||
}
|
}
|
||||||
") = $caPath ]]
|
") = $caPath ]]
|
||||||
|
|
||||||
|
[ -e $caPath ]
|
||||||
|
|
||||||
# Check that URL query parameters aren't allowed.
|
# Check that URL query parameters aren't allowed.
|
||||||
clearStore
|
clearStore
|
||||||
narCache=$TEST_ROOT/nar-cache
|
narCache=$TEST_ROOT/nar-cache
|
||||||
|
@ -77,3 +106,45 @@ rm -rf $narCache
|
||||||
}
|
}
|
||||||
")
|
")
|
||||||
(! [ -e $narCache ])
|
(! [ -e $narCache ])
|
||||||
|
|
||||||
|
# If toPath is specified but wrong, we check it (only) when the path is missing.
|
||||||
|
clearStore
|
||||||
|
|
||||||
|
badPath=$(echo $caPath | sed -e 's!/store/................................-!/store/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-!')
|
||||||
|
|
||||||
|
[ ! -e $badPath ]
|
||||||
|
|
||||||
|
expectStderr 1 nix eval -v --raw --expr "
|
||||||
|
builtins.fetchClosure {
|
||||||
|
fromStore = \"file://$cacheDir\";
|
||||||
|
fromPath = $nonCaPath;
|
||||||
|
toPath = $badPath;
|
||||||
|
}
|
||||||
|
" | grep "error: rewriting.*$nonCaPath.*yielded.*$caPath.*while.*$badPath.*was expected"
|
||||||
|
|
||||||
|
[ ! -e $badPath ]
|
||||||
|
|
||||||
|
# We only check it when missing, as a performance optimization similar to what we do for fixed output derivations. So if it's already there, we don't check it.
|
||||||
|
# It would be nice for this to fail, but checking it would be too(?) slow.
|
||||||
|
[ -e $caPath ]
|
||||||
|
|
||||||
|
[[ $(nix eval -v --raw --expr "
|
||||||
|
builtins.fetchClosure {
|
||||||
|
fromStore = \"file://$cacheDir\";
|
||||||
|
fromPath = $badPath;
|
||||||
|
toPath = $caPath;
|
||||||
|
}
|
||||||
|
") = $caPath ]]
|
||||||
|
|
||||||
|
|
||||||
|
# However, if the output address is unexpected, we can report it
|
||||||
|
|
||||||
|
|
||||||
|
expectStderr 1 nix eval -v --raw --expr "
|
||||||
|
builtins.fetchClosure {
|
||||||
|
fromStore = \"file://$cacheDir\";
|
||||||
|
fromPath = $caPath;
|
||||||
|
inputAddressed = true;
|
||||||
|
}
|
||||||
|
" | grepQuiet 'error.*The store object referred to by.*fromPath.* at .* is not input-addressed, but .*inputAddressed.* is set to .*true.*'
|
||||||
|
|
||||||
|
|
tests/lang-test-infra.sh (new file, 86 lines)
@@ -0,0 +1,86 @@
+# Test the function for lang.sh
+source common.sh
+
+source lang/framework.sh
+
+# We are testing this, so don't want outside world to affect us.
+unset _NIX_TEST_ACCEPT
+
+# We'll only modify this in subshells so we don't need to reset it.
+badDiff=0
+
+# matches non-empty
+echo Hi! > "$TEST_ROOT/got"
+cp "$TEST_ROOT/got" "$TEST_ROOT/expected"
+(
+diffAndAcceptInner test "$TEST_ROOT/got" "$TEST_ROOT/expected"
+(( "$badDiff" == 0 ))
+)
+
+# matches empty, non-existant file is the same as empty file
+echo -n > "$TEST_ROOT/got"
+(
+diffAndAcceptInner test "$TEST_ROOT/got" "$TEST_ROOT/does-not-exist"
+(( "$badDiff" == 0 ))
+)
+
+# doesn't matches non-empty, non-existant file is the same as empty file
+echo Hi! > "$TEST_ROOT/got"
+(
+diffAndAcceptInner test "$TEST_ROOT/got" "$TEST_ROOT/does-not-exist"
+(( "$badDiff" == 1 ))
+)
+
+# doesn't match, `badDiff` set, file unchanged
+echo Hi! > "$TEST_ROOT/got"
+echo Bye! > "$TEST_ROOT/expected"
+(
+diffAndAcceptInner test "$TEST_ROOT/got" "$TEST_ROOT/expected"
+(( "$badDiff" == 1 ))
+)
+[[ "$(echo Bye! )" == $(< "$TEST_ROOT/expected") ]]
+
+# _NIX_TEST_ACCEPT=1 matches non-empty
+echo Hi! > "$TEST_ROOT/got"
+cp "$TEST_ROOT/got" "$TEST_ROOT/expected"
+(
+_NIX_TEST_ACCEPT=1 diffAndAcceptInner test "$TEST_ROOT/got" "$TEST_ROOT/expected"
+(( "$badDiff" == 0 ))
+)
+
+# _NIX_TEST_ACCEPT doesn't match, `badDiff=1` set, file changed (was previously non-empty)
+echo Hi! > "$TEST_ROOT/got"
+echo Bye! > "$TEST_ROOT/expected"
+(
+_NIX_TEST_ACCEPT=1 diffAndAcceptInner test "$TEST_ROOT/got" "$TEST_ROOT/expected"
+(( "$badDiff" == 1 ))
+)
+[[ "$(echo Hi! )" == $(< "$TEST_ROOT/expected") ]]
+# second time succeeds
+(
+diffAndAcceptInner test "$TEST_ROOT/got" "$TEST_ROOT/expected"
+(( "$badDiff" == 0 ))
+)
+
+# _NIX_TEST_ACCEPT matches empty, non-existant file not created
+echo -n > "$TEST_ROOT/got"
+(
+_NIX_TEST_ACCEPT=1 diffAndAcceptInner test "$TEST_ROOT/got" "$TEST_ROOT/does-not-exists"
+(( "$badDiff" == 0 ))
+)
+[[ ! -f "$TEST_ROOT/does-not-exist" ]]
+
+# _NIX_TEST_ACCEPT doesn't match, output empty, file deleted
+echo -n > "$TEST_ROOT/got"
+echo Bye! > "$TEST_ROOT/expected"
+badDiff=0
+(
+_NIX_TEST_ACCEPT=1 diffAndAcceptInner test "$TEST_ROOT/got" "$TEST_ROOT/expected"
+(( "$badDiff" == 1 ))
+)
+[[ ! -f "$TEST_ROOT/expected" ]]
+# second time succeeds
+(
+diffAndAcceptInner test "$TEST_ROOT/got" "$TEST_ROOT/expected"
+(( "$badDiff" == 0 ))
+)
tests/lang.sh (124 changed lines; mode changed: Normal file → Executable file)
@@ -1,5 +1,17 @@
 source common.sh

+set -o pipefail
+
+source lang/framework.sh
+
+# specialize function a bit
+function diffAndAccept() {
+local -r testName="$1"
+local -r got="lang/$testName.$2"
+local -r expected="lang/$testName.$3"
+diffAndAcceptInner "$testName" "$got" "$expected"
+}
+
 export TEST_VAR=foo # for eval-okay-getenv.nix
 export NIX_REMOTE=dummy://
 export NIX_STORE_DIR=/nix/store

@@ -20,63 +32,115 @@ nix-instantiate --eval -E 'let x = { repeating = x; tracing = builtins.trace x t

 set +x

-fail=0
+badDiff=0
+badExitCode=0

 for i in lang/parse-fail-*.nix; do
 echo "parsing $i (should fail)";
-i=$(basename $i .nix)
-if ! expect 1 nix-instantiate --parse - < lang/$i.nix; then
+i=$(basename "$i" .nix)
+if expectStderr 1 nix-instantiate --parse - < "lang/$i.nix" > "lang/$i.err"
+then
+diffAndAccept "$i" err err.exp
+else
 echo "FAIL: $i shouldn't parse"
-fail=1
+badExitCode=1
 fi
 done

 for i in lang/parse-okay-*.nix; do
 echo "parsing $i (should succeed)";
-i=$(basename $i .nix)
-if ! expect 0 nix-instantiate --parse - < lang/$i.nix > lang/$i.out; then
+i=$(basename "$i" .nix)
+if
+expect 0 nix-instantiate --parse - < "lang/$i.nix" \
+1> "lang/$i.out" \
+2> "lang/$i.err"
+then
+sed "s!$(pwd)!/pwd!g" "lang/$i.out" "lang/$i.err"
+diffAndAccept "$i" out exp
+diffAndAccept "$i" err err.exp
+else
 echo "FAIL: $i should parse"
-fail=1
+badExitCode=1
 fi
 done

 for i in lang/eval-fail-*.nix; do
 echo "evaluating $i (should fail)";
-i=$(basename $i .nix)
-if ! expect 1 nix-instantiate --eval lang/$i.nix; then
+i=$(basename "$i" .nix)
+if
+expectStderr 1 nix-instantiate --show-trace "lang/$i.nix" \
+| sed "s!$(pwd)!/pwd!g" > "lang/$i.err"
+then
+diffAndAccept "$i" err err.exp
+else
 echo "FAIL: $i shouldn't evaluate"
-fail=1
+badExitCode=1
 fi
 done

 for i in lang/eval-okay-*.nix; do
 echo "evaluating $i (should succeed)";
-i=$(basename $i .nix)
+i=$(basename "$i" .nix)

-if test -e lang/$i.exp; then
-flags=
-if test -e lang/$i.flags; then
-flags=$(cat lang/$i.flags)
-fi
-if ! expect 0 env NIX_PATH=lang/dir3:lang/dir4 HOME=/fake-home nix-instantiate $flags --eval --strict lang/$i.nix > lang/$i.out; then
+if test -e "lang/$i.exp.xml"; then
+if expect 0 nix-instantiate --eval --xml --no-location --strict \
+"lang/$i.nix" > "lang/$i.out.xml"
+then
+diffAndAccept "$i" out.xml exp.xml
+else
 echo "FAIL: $i should evaluate"
-fail=1
-elif ! diff <(< lang/$i.out sed -e "s|$(pwd)|/pwd|g") lang/$i.exp; then
-echo "FAIL: evaluation result of $i not as expected"
-fail=1
+badExitCode=1
+fi
+elif test ! -e "lang/$i.exp-disabled"; then
+declare -a flags=()
+if test -e "lang/$i.flags"; then
+read -r -a flags < "lang/$i.flags"
 fi
-fi

-if test -e lang/$i.exp.xml; then
-if ! expect 0 nix-instantiate --eval --xml --no-location --strict \
-lang/$i.nix > lang/$i.out.xml; then
+if
+expect 0 env \
+NIX_PATH=lang/dir3:lang/dir4 \
+HOME=/fake-home \
+nix-instantiate "${flags[@]}" --eval --strict "lang/$i.nix" \
+1> "lang/$i.out" \
+2> "lang/$i.err"
+then
+sed -i "s!$(pwd)!/pwd!g" "lang/$i.out" "lang/$i.err"
+diffAndAccept "$i" out exp
+diffAndAccept "$i" err err.exp
+else
 echo "FAIL: $i should evaluate"
-fail=1
-elif ! cmp -s lang/$i.out.xml lang/$i.exp.xml; then
-echo "FAIL: XML evaluation result of $i not as expected"
-fail=1
+badExitCode=1
 fi
 fi
 done

-exit $fail
+if test -n "${_NIX_TEST_ACCEPT-}"; then
+if (( "$badDiff" )); then
+echo 'Output did mot match, but accepted output as the persisted expected output.'
+echo 'That means the next time the tests are run, they should pass.'
+else
+echo 'NOTE: Environment variable _NIX_TEST_ACCEPT is defined,'
+echo 'indicating the unexpected output should be accepted as the expected output going forward,'
+echo 'but no tests had unexpected output so there was no expected output to update.'
+fi
+if (( "$badExitCode" )); then
+exit "$badExitCode"
+else
+skipTest "regenerating golden masters"
+fi
+else
+if (( "$badDiff" )); then
+echo ''
+echo 'You can rerun this test with:'
+echo ''
+echo '    _NIX_TEST_ACCEPT=1 make tests/lang.sh.test'
+echo ''
+echo 'to regenerate the files containing the expected output,'
+echo 'and then view the git diff to decide whether a change is'
+echo 'good/intentional or bad/unintentional.'
+echo 'If the diff contains arbitrary or impure information,'
+echo 'please improve the normalization that the test applies to the output.'
+fi
+exit $(( "$badExitCode" + "$badDiff" ))
+fi
tests/lang/empty.exp (new file, 0 lines)

tests/lang/eval-fail-abort.err.exp (new file, 10 lines)
@@ -0,0 +1,10 @@
+error:
+… while calling the 'abort' builtin
+
+at /pwd/lang/eval-fail-abort.nix:1:14:
+
+1| if true then abort "this should fail" else 1
+| ^
+2|
+
+error: evaluation aborted with the following error message: 'this should fail'

tests/lang/eval-fail-antiquoted-path.err.exp (new file, 1 line)
@@ -0,0 +1 @@
+error: getting attributes of path ‘PWD/lang/fnord’: No such file or directory

tests/lang/eval-fail-assert.err.exp (new file, 36 lines)
@@ -0,0 +1,36 @@
+error:
+… while evaluating the attribute 'body'
+
+at /pwd/lang/eval-fail-assert.nix:4:3:
+
+3|
+4| body = x "x";
+| ^
+5| }
+
+… from call site
+
+at /pwd/lang/eval-fail-assert.nix:4:10:
+
+3|
+4| body = x "x";
+| ^
+5| }
+
+… while calling 'x'
+
+at /pwd/lang/eval-fail-assert.nix:2:7:
+
+1| let {
+2| x = arg: assert arg == "y"; 123;
+| ^
+3|
+
+error: assertion '(arg == "y")' failed
+
+at /pwd/lang/eval-fail-assert.nix:2:12:
+
+1| let {
+2| x = arg: assert arg == "y"; 123;
+| ^
+3|

tests/lang/eval-fail-bad-antiquote-1.err.exp (new file, 10 lines)
@@ -0,0 +1,10 @@
+error:
+… while evaluating a path segment
+
+at /pwd/lang/eval-fail-bad-antiquote-1.nix:1:2:
+
+1| "${x: x}"
+| ^
+2|
+
+error: cannot coerce a function to a string

tests/lang/eval-fail-bad-antiquote-2.err.exp (new file, 1 line)
@@ -0,0 +1 @@
+error: operation 'addToStoreFromDump' is not supported by store 'dummy'

tests/lang/eval-fail-bad-antiquote-3.err.exp (new file, 10 lines)
@@ -0,0 +1,10 @@
+error:
+… while evaluating a path segment
+
+at /pwd/lang/eval-fail-bad-antiquote-3.nix:1:3:
+
+1| ''${x: x}''
+| ^
+2|
+
+error: cannot coerce a function to a string

tests/lang/eval-fail-bad-string-interpolation-1.err.exp (new file, 10 lines)
@@ -0,0 +1,10 @@
+error:
+… while evaluating a path segment
+
+at /pwd/lang/eval-fail-bad-string-interpolation-1.nix:1:2:
+
+1| "${x: x}"
+| ^
+2|
+
+error: cannot coerce a function to a string

tests/lang/eval-fail-bad-string-interpolation-2.err.exp (new file, 1 line)
@@ -0,0 +1 @@
+error: operation 'addToStoreFromDump' is not supported by store 'dummy'

tests/lang/eval-fail-bad-string-interpolation-3.err.exp (new file, 10 lines)
@@ -0,0 +1,10 @@
+error:
+… while evaluating a path segment
+
+at /pwd/lang/eval-fail-bad-string-interpolation-3.nix:1:3:
+
+1| ''${x: x}''
+| ^
+2|
+
+error: cannot coerce a function to a string

tests/lang/eval-fail-blackhole.err.exp (new file, 18 lines)
@@ -0,0 +1,18 @@
+error:
+… while evaluating the attribute 'body'
+
+at /pwd/lang/eval-fail-blackhole.nix:2:3:
+
+1| let {
+2| body = x;
+| ^
+3| x = y;
+
+error: infinite recursion encountered
+
+at /pwd/lang/eval-fail-blackhole.nix:3:7:
+
+2| body = x;
+3| x = y;
+| ^
+4| y = x;

tests/lang/eval-fail-deepseq.err.exp (new file, 26 lines)
@@ -0,0 +1,26 @@
+error:
+… while calling the 'deepSeq' builtin
+
+at /pwd/lang/eval-fail-deepseq.nix:1:1:
+
+1| builtins.deepSeq { x = abort "foo"; } 456
+| ^
+2|
+
+… while evaluating the attribute 'x'
+
+at /pwd/lang/eval-fail-deepseq.nix:1:20:
+
+1| builtins.deepSeq { x = abort "foo"; } 456
+| ^
+2|
+
+… while calling the 'abort' builtin
+
+at /pwd/lang/eval-fail-deepseq.nix:1:24:
+
+1| builtins.deepSeq { x = abort "foo"; } 456
+| ^
+2|
+
+error: evaluation aborted with the following error message: 'foo'
@@ -0,0 +1,38 @@
+error:
+… while calling the 'foldl'' builtin
+
+at /pwd/lang/eval-fail-foldlStrict-strict-op-application.nix:2:1:
+
+1| # Tests that the result of applying op is forced even if the value is never used
+2| builtins.foldl'
+| ^
+3| (_: f: f null)
+
+… while calling anonymous lambda
+
+at /pwd/lang/eval-fail-foldlStrict-strict-op-application.nix:3:7:
+
+2| builtins.foldl'
+3| (_: f: f null)
+| ^
+4| null
+
+… from call site
+
+at /pwd/lang/eval-fail-foldlStrict-strict-op-application.nix:3:10:
+
+2| builtins.foldl'
+3| (_: f: f null)
+| ^
+4| null
+
+… while calling anonymous lambda
+
+at /pwd/lang/eval-fail-foldlStrict-strict-op-application.nix:5:6:
+
+4| null
+5| [ (_: throw "Not the final value, but is still forced!") (_: 23) ]
+| ^
+6|
+
+error: Not the final value, but is still forced!

tests/lang/eval-fail-fromTOML-timestamps.err.exp (new file, 12 lines)
@@ -0,0 +1,12 @@
+error:
+… while calling the 'fromTOML' builtin
+
+at /pwd/lang/eval-fail-fromTOML-timestamps.nix:1:1:
+
+1| builtins.fromTOML ''
+| ^
+2| key = "value"
+
+error: while parsing a TOML string: Dates and times are not supported
+
+at «none»:0: (source not available)

tests/lang/eval-fail-hashfile-missing.err.exp (new file, 19 lines)
@@ -0,0 +1,19 @@
+error:
+… while calling the 'toString' builtin
+
+at /pwd/lang/eval-fail-hashfile-missing.nix:4:3:
+
+3| in
+4| toString (builtins.concatLists (map (hash: map (builtins.hashFile hash) paths) ["md5" "sha1" "sha256" "sha512"]))
+| ^
+5|
+
+… while evaluating the first argument passed to builtins.toString
+
+at «none»:0: (source not available)
+
+… while calling the 'hashFile' builtin
+
+at «none»:0: (source not available)
+
+error: opening file '/pwd/lang/this-file-is-definitely-not-there-7392097': No such file or directory

tests/lang/eval-fail-list.err.exp (new file, 10 lines)
@@ -0,0 +1,10 @@
+error:
+… while evaluating one of the elements to concatenate
+
+at /pwd/lang/eval-fail-list.nix:1:2:
+
+1| 8++1
+| ^
+2|
+
+error: value is an integer while a list was expected

tests/lang/eval-fail-list.nix (new file, 1 line)
@@ -0,0 +1 @@
+8++1

tests/lang/eval-fail-missing-arg.err.exp (new file, 16 lines)
@@ -0,0 +1,16 @@
+error:
+… from call site
+
+at /pwd/lang/eval-fail-missing-arg.nix:1:1:
+
+1| ({x, y, z}: x + y + z) {x = "foo"; z = "bar";}
+| ^
+2|
+
+error: function 'anonymous lambda' called without required argument 'y'
+
+at /pwd/lang/eval-fail-missing-arg.nix:1:2:
+
+1| ({x, y, z}: x + y + z) {x = "foo"; z = "bar";}
+| ^
+2|

tests/lang/eval-fail-nonexist-path.err.exp (new file, 1 line)
@@ -0,0 +1 @@
+error: operation 'addToStoreFromDump' is not supported by store 'dummy'

tests/lang/eval-fail-path-slash.err.exp (new file, 8 lines)
@@ -0,0 +1,8 @@
+error: path has a trailing slash
+
+at /pwd/lang/eval-fail-path-slash.nix:6:12:
+
+5| # and https://nixos.org/nix-dev/2016-June/020829.html
+6| /nix/store/
+| ^
+7|

tests/lang/eval-fail-recursion.err.exp (new file, 16 lines)
@@ -0,0 +1,16 @@
+error:
+… in the right operand of the update (//) operator
+
+at /pwd/lang/eval-fail-recursion.nix:1:12:
+
+1| let a = {} // a; in a.foo
+| ^
+2|
+
+error: infinite recursion encountered
+
+at /pwd/lang/eval-fail-recursion.nix:1:15:
+
+1| let a = {} // a; in a.foo
+| ^
+2|

tests/lang/eval-fail-recursion.nix (new file, 1 line)
@@ -0,0 +1 @@
+let a = {} // a; in a.foo

tests/lang/eval-fail-remove.err.exp (new file, 19 lines)
@@ -0,0 +1,19 @@
+error:
+… while evaluating the attribute 'body'
+
+at /pwd/lang/eval-fail-remove.nix:4:3:
+
+3|
+4| body = (removeAttrs attrs ["x"]).x;
+| ^
+5| }
+
+error: attribute 'x' missing
+
+at /pwd/lang/eval-fail-remove.nix:4:10:
+
+3|
+4| body = (removeAttrs attrs ["x"]).x;
+| ^
+5| }
+Did you mean y?

tests/lang/eval-fail-scope-5.err.exp (new file, 36 lines)
@@ -0,0 +1,36 @@
+error:
+… while evaluating the attribute 'body'
+
+at /pwd/lang/eval-fail-scope-5.nix:8:3:
+
+7|
+8| body = f {};
+| ^
+9|
+
+… from call site
+
+at /pwd/lang/eval-fail-scope-5.nix:8:10:
+
+7|
+8| body = f {};
+| ^
+9|
+
+… while calling 'f'
+
+at /pwd/lang/eval-fail-scope-5.nix:6:7:
+
+5|
+6| f = {x ? y, y ? x}: x + y;
+| ^
+7|
+
+error: infinite recursion encountered
+
+at /pwd/lang/eval-fail-scope-5.nix:6:12:
+
+5|
+6| f = {x ? y, y ? x}: x + y;
+| ^
+7|

tests/lang/eval-fail-seq.err.exp (new file, 18 lines)
@@ -0,0 +1,18 @@
+error:
+… while calling the 'seq' builtin
+
+at /pwd/lang/eval-fail-seq.nix:1:1:
+
+1| builtins.seq (abort "foo") 2
+| ^
+2|
+
+… while calling the 'abort' builtin
+
+at /pwd/lang/eval-fail-seq.nix:1:15:
+
+1| builtins.seq (abort "foo") 2
+| ^
+2|
+
+error: evaluation aborted with the following error message: 'foo'

tests/lang/eval-fail-set-override.err.exp (new file, 6 lines)
@@ -0,0 +1,6 @@
+error:
+… while evaluating the `__overrides` attribute
+
+at «none»:0: (source not available)
+
+error: value is an integer while a set was expected

tests/lang/eval-fail-set-override.nix (new file, 1 line)
@@ -0,0 +1 @@
+rec { __overrides = 1; }

tests/lang/eval-fail-set.err.exp (new file, 7 lines)
@@ -0,0 +1,7 @@
+error: undefined variable 'x'
+
+at /pwd/lang/eval-fail-set.nix:1:3:
+
+1| 8.x
+| ^
+2|

tests/lang/eval-fail-set.nix (new file, 1 line)
@@ -0,0 +1 @@
+8.x

tests/lang/eval-fail-substring.err.exp (new file, 12 lines)
@@ -0,0 +1,12 @@
+error:
+… while calling the 'substring' builtin
+
+at /pwd/lang/eval-fail-substring.nix:1:1:
+
+1| builtins.substring (builtins.sub 0 1) 1 "x"
+| ^
+2|
+
+error: negative start position in 'substring'
+
+at «none»:0: (source not available)

tests/lang/eval-fail-to-path.err.exp (new file, 14 lines)
@@ -0,0 +1,14 @@
+error:
+… while calling the 'toPath' builtin
+
+at /pwd/lang/eval-fail-to-path.nix:1:1:
+
+1| builtins.toPath "foo/bar"
+| ^
+2|
+
+… while evaluating the first argument passed to builtins.toPath
+
+at «none»:0: (source not available)
+
+error: string 'foo/bar' doesn't represent an absolute path

tests/lang/eval-fail-undeclared-arg.err.exp (new file, 17 lines)
@@ -0,0 +1,17 @@
+error:
+… from call site
+
+at /pwd/lang/eval-fail-undeclared-arg.nix:1:1:
+
+1| ({x, z}: x + z) {x = "foo"; y = "bla"; z = "bar";}
+| ^
+2|
+
+error: function 'anonymous lambda' called with unexpected argument 'y'
+
+at /pwd/lang/eval-fail-undeclared-arg.nix:1:2:
+
+1| ({x, z}: x + z) {x = "foo"; y = "bla"; z = "bar";}
+| ^
+2|
+
+Did you mean one of x or z?
@@ -11,9 +11,12 @@ builtins.fromJSON
 "Width": 200,
 "Height": 250
 },
+"Animated" : false,
+"IDs": [116, 943, 234, 38793, true ,false,null, -100],
+"Escapes": "\"\\\/\t\n\r\t",
 "Subtitle" : false,
-"Latitude": 46.2051,
-"Longitude": 6.0723
+"Latitude": 37.7668,
+"Longitude": -122.3959
 }
 }
 ''

@@ -28,8 +31,11 @@ builtins.fromJSON
 Width = 200;
 Height = 250;
 };
+Animated = false;
+IDs = [ 116 943 234 38793 true false null (0-100) ];
+Escapes = "\"\\\/\t\n\r\t"; # supported in JSON but not Nix: \b\f
 Subtitle = false;
-Latitude = 46.2051;
-Longitude = 6.0723;
+Latitude = 37.7668;
+Longitude = -122.3959;
 };
 }

@@ -1,6 +1,6 @@
 let

-overrides = { a = 2; };
+overrides = { a = 2; b = 3; };

 in (rec {
 __overrides = overrides;

tests/lang/eval-okay-print.err.exp (new file, 1 line)
@@ -0,0 +1 @@
+trace: [ <CODE> ]

tests/lang/eval-okay-print.exp (new file, 1 line)
@@ -0,0 +1 @@
+[ null <PRIMOP> <PRIMOP-APP> <LAMBDA> [ [ «repeated» ] ] ]

tests/lang/eval-okay-print.nix (new file, 1 line)
@@ -0,0 +1 @@
+with builtins; trace [(1+1)] [ null toString (deepSeq "x") (a: a) (let x=[x]; in x) ]

@@ -1 +1 @@
--I lang/dir1 -I lang/dir2 -I dir5=lang/dir3
+-I lang/dir1 -I lang/dir2 -I dir5=lang/dir3
tests/lang/framework.sh (new file, 33 lines)
@@ -0,0 +1,33 @@
+# Golden test support
+#
+# Test that the output of the given test matches what is expected. If
+# `_NIX_TEST_ACCEPT` is non-empty also update the expected output so
+# that next time the test succeeds.
+function diffAndAcceptInner() {
+local -r testName=$1
+local -r got="$2"
+local -r expected="$3"
+
+# Absence of expected file indicates empty output expected.
+if test -e "$expected"; then
+local -r expectedOrEmpty="$expected"
+else
+local -r expectedOrEmpty=lang/empty.exp
+fi
+
+# Diff so we get a nice message
+if ! diff --unified "$got" "$expectedOrEmpty"; then
+echo "FAIL: evaluation result of $testName not as expected"
+badDiff=1
+fi
+
+# Update expected if `_NIX_TEST_ACCEPT` is non-empty.
+if test -n "${_NIX_TEST_ACCEPT-}"; then
+cp "$got" "$expected"
+# Delete empty expected files to avoid bloating the repo with
+# empty files.
+if ! test -s "$expected"; then
+rm "$expected"
+fi
+fi
+}
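Editorial note on the helper above: `diffAndAcceptInner` only records a mismatch by setting `badDiff=1`; the caller decides how to react. A minimal sketch of a caller (not part of this commit) follows, assuming it runs from the `tests/` directory so that `lang/empty.exp` resolves; the test name `sample` and the `some-command` producing the output are placeholders:

    # Hypothetical caller of diffAndAcceptInner (sketch only, not from this commit).
    source common.sh          # provides $TEST_ROOT and skipTest in these tests
    source lang/framework.sh  # provides diffAndAcceptInner

    badDiff=0                                  # the helper flips this to 1 on mismatch
    some-command > "$TEST_ROOT/sample.out"     # 'some-command' is a placeholder
    diffAndAcceptInner sample "$TEST_ROOT/sample.out" lang/sample.exp
    exit "$badDiff"

The committed `tests/lang.sh` wraps this pattern in its own `diffAndAccept` function, as shown earlier in the diff.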
tests/lang/parse-fail-dup-attrs-1.err.exp (new file, 7 lines)
@@ -0,0 +1,7 @@
+error: attribute 'x' already defined at «stdin»:1:3
+
+at «stdin»:3:3:
+
+2| y = 456;
+3| x = 789;
+| ^

tests/lang/parse-fail-dup-attrs-2.err.exp (new file, 7 lines)
@@ -0,0 +1,7 @@
+error: attribute 'x' already defined at «stdin»:9:5
+
+at «stdin»:10:17:
+
+9| x = 789;
+10| inherit (as) x;
+| ^

tests/lang/parse-fail-dup-attrs-3.err.exp (new file, 7 lines)
@@ -0,0 +1,7 @@
+error: attribute 'x' already defined at «stdin»:9:5
+
+at «stdin»:10:17:
+
+9| x = 789;
+10| inherit (as) x;
+| ^

tests/lang/parse-fail-dup-attrs-4.err.exp (new file, 7 lines)
@@ -0,0 +1,7 @@
+error: attribute 'services.ssh.port' already defined at «stdin»:2:3
+
+at «stdin»:3:3:
+
+2| services.ssh.port = 22;
+3| services.ssh.port = 23;
+| ^

tests/lang/parse-fail-dup-attrs-6.err.exp (new file, 1 line)
@@ -0,0 +1 @@
+error: attribute ‘services.ssh’ at (string):3:3 already defined at (string):2:3

tests/lang/parse-fail-dup-attrs-7.err.exp (new file, 7 lines)
@@ -0,0 +1,7 @@
+error: attribute 'x' already defined at «stdin»:6:12
+
+at «stdin»:7:12:
+
+6| inherit x;
+7| inherit x;
+| ^

tests/lang/parse-fail-dup-formals.err.exp (new file, 6 lines)
@@ -0,0 +1,6 @@
+error: duplicate formal function argument 'x'
+
+at «stdin»:1:8:
+
+1| {x, y, x}: x
+| ^

tests/lang/parse-fail-eof-in-string.err.exp (new file, 7 lines)
@@ -0,0 +1,7 @@
+error: syntax error, unexpected end of file, expecting '"'
+
+at «stdin»:3:5:
+
+2| # Note that this file must not end with a newline.
+3| a 1"$
+| ^

tests/lang/parse-fail-mixed-nested-attrs1.err.exp (new file, 8 lines)
@@ -0,0 +1,8 @@
+error: attribute 'z' already defined at «stdin»:3:16
+
+at «stdin»:2:3:
+
+1| {
+2| x.z = 3;
+| ^
+3| x = { y = 3; z = 3; };

tests/lang/parse-fail-mixed-nested-attrs2.err.exp (new file, 8 lines)
@@ -0,0 +1,8 @@
+error: attribute 'y' already defined at «stdin»:3:9
+
+at «stdin»:2:3:
+
+1| {
+2| x.y.y = 3;
+| ^
+3| x = { y.y= 3; z = 3; };

tests/lang/parse-fail-patterns-1.err.exp (new file, 7 lines)
@@ -0,0 +1,7 @@
+error: duplicate formal function argument 'args'
+
+at «stdin»:1:1:
+
+1| args@{args, x, y, z}: x
+| ^
+2|

tests/lang/parse-fail-regression-20060610.err.exp (new file, 8 lines)
@@ -0,0 +1,8 @@
+error: undefined variable 'gcc'
+
+at «stdin»:8:12:
+
+7|
+8| body = ({
+| ^
+9| inherit gcc;

tests/lang/parse-fail-undef-var-2.err.exp (new file, 7 lines)
@@ -0,0 +1,7 @@
+error: syntax error, unexpected ':', expecting '}'
+
+at «stdin»:3:13:
+
+2|
+3| f = {x, y :
+| ^

tests/lang/parse-fail-undef-var.err.exp (new file, 7 lines)
@@ -0,0 +1,7 @@
+error: undefined variable 'y'
+
+at «stdin»:1:4:
+
+1| x: y
+| ^
+2|

tests/lang/parse-fail-utf8.err.exp (new file, 6 lines)
@@ -0,0 +1,6 @@
+error: syntax error, unexpected invalid token, expecting end of file
+
+at «stdin»:1:5:
+
+1| 123 Ã
+| ^

tests/lang/parse-okay-1.exp (new file, 1 line)
@@ -0,0 +1 @@
+({ x, y, z }: ((x + y) + z))

tests/lang/parse-okay-crlf.exp (new file, 1 line)
@@ -0,0 +1 @@
+rec { foo = "multi\nline\n string\n test\r"; x = y; y = 123; z = 456; }

tests/lang/parse-okay-dup-attrs-5.exp (new file, 1 line)
@@ -0,0 +1 @@
+{ services = { ssh = { enable = true; port = 23; }; }; }

tests/lang/parse-okay-dup-attrs-6.exp (new file, 1 line)
@@ -0,0 +1 @@
+{ services = { ssh = { enable = true; port = 23; }; }; }

tests/lang/parse-okay-mixed-nested-attrs-1.exp (new file, 1 line)
@@ -0,0 +1 @@
+{ x = { q = 3; y = 3; z = 3; }; }

tests/lang/parse-okay-mixed-nested-attrs-2.exp (new file, 1 line)
@@ -0,0 +1 @@
+{ x = { q = 3; y = 3; z = 3; }; }

tests/lang/parse-okay-mixed-nested-attrs-3.exp (new file, 1 line)
@@ -0,0 +1 @@
+{ services = { httpd = { enable = true; }; ssh = { enable = true; port = 123; }; }; }

tests/lang/parse-okay-regression-20041027.exp (new file, 1 line)
@@ -0,0 +1 @@
+({ fetchurl, stdenv }: ((stdenv).mkDerivation { name = "libXi-6.0.1"; src = (fetchurl { md5 = "7e935a42428d63a387b3c048be0f2756"; url = "http://freedesktop.org/~xlibs/release/libXi-6.0.1.tar.bz2"; }); }))

tests/lang/parse-okay-regression-751.exp (new file, 1 line)
@@ -0,0 +1 @@
+(let const = (a: "const"); in ((const { x = "q"; })))

tests/lang/parse-okay-subversion.exp (new file, 1 line)
@@ -0,0 +1 @@
+({ fetchurl, localServer ? false, httpServer ? false, sslSupport ? false, pythonBindings ? false, javaSwigBindings ? false, javahlBindings ? false, stdenv, openssl ? null, httpd ? null, db4 ? null, expat, swig ? null, j2sdk ? null }: assert (expat != null); assert (localServer -> (db4 != null)); assert (httpServer -> ((httpd != null) && ((httpd).expat == expat))); assert (sslSupport -> ((openssl != null) && (httpServer -> ((httpd).openssl == openssl)))); assert (pythonBindings -> ((swig != null) && (swig).pythonSupport)); assert (javaSwigBindings -> ((swig != null) && (swig).javaSupport)); assert (javahlBindings -> (j2sdk != null)); ((stdenv).mkDerivation { builder = /foo/bar; db4 = (if localServer then db4 else null); inherit expat ; inherit httpServer ; httpd = (if httpServer then httpd else null); j2sdk = (if javaSwigBindings then (swig).j2sdk else (if javahlBindings then j2sdk else null)); inherit javaSwigBindings ; inherit javahlBindings ; inherit localServer ; name = "subversion-1.1.1"; openssl = (if sslSupport then openssl else null); patches = (if javahlBindings then [ (/javahl.patch) ] else [ ]); python = (if pythonBindings then (swig).python else null); inherit pythonBindings ; src = (fetchurl { md5 = "a180c3fe91680389c210c99def54d9e0"; url = "http://subversion.tigris.org/tarballs/subversion-1.1.1.tar.bz2"; }); inherit sslSupport ; swig = (if (pythonBindings || javaSwigBindings) then swig else null); }))

tests/lang/parse-okay-url.exp (new file, 1 line)
@@ -0,0 +1 @@
+[ ("x:x") ("https://svn.cs.uu.nl:12443/repos/trace/trunk") ("http://www2.mplayerhq.hu/MPlayer/releases/fonts/font-arial-iso-8859-1.tar.bz2") ("http://losser.st-lab.cs.uu.nl/~armijn/.nix/gcc-3.3.4-static-nix.tar.gz") ("http://fpdownload.macromedia.com/get/shockwave/flash/english/linux/7.0r25/install_flash_player_7_linux.tar.gz") ("https://ftp5.gwdg.de/pub/linux/archlinux/extra/os/x86_64/unzip-6.0-14-x86_64.pkg.tar.zst") ("ftp://ftp.gtk.org/pub/gtk/v1.2/gtk+-1.2.10.tar.gz") ]
@@ -20,6 +20,7 @@ nix_tests = \
 remote-store.sh \
 legacy-ssh-store.sh \
 lang.sh \
+lang-test-infra.sh \
 experimental-features.sh \
 fetchMercurial.sh \
 gc-auto.sh \

@@ -24,3 +24,9 @@ eval_stdin_res=$(echo 'let a = {} // a; in a.foo' | nix-instantiate --eval -E -
 echo $eval_stdin_res | grep "at «stdin»:1:15:"
 echo $eval_stdin_res | grep "infinite recursion encountered"

+# Attribute path errors
+expectStderr 1 nix-instantiate --eval -E '{}' -A '"x' | grepQuiet "missing closing quote in selection path"
+expectStderr 1 nix-instantiate --eval -E '[]' -A 'x' | grepQuiet "should be a set"
+expectStderr 1 nix-instantiate --eval -E '{}' -A '1' | grepQuiet "should be a list"
+expectStderr 1 nix-instantiate --eval -E '{}' -A '.' | grepQuiet "empty attribute name"
+expectStderr 1 nix-instantiate --eval -E '[]' -A '1' | grepQuiet "out of range"

@@ -49,3 +49,5 @@ output="$(nix eval --raw --restrict-eval -I "$traverseDir" \
 2>&1 || :)"
 echo "$output" | grep "is forbidden"
 echo "$output" | grepInverse -F restricted-secret
+
+expectStderr 1 nix-instantiate --restrict-eval true ./dependencies.nix | grepQuiet "forbidden in restricted mode"

@@ -84,6 +84,10 @@ info=$(nix path-info --store file://$cacheDir --json $outPath2)
 # Copying to a diverted store should fail due to a lack of signatures by trusted keys.
 chmod -R u+w $TEST_ROOT/store0 || true
 rm -rf $TEST_ROOT/store0
+
+# Fails or very flaky only on GHA + macOS:
+# expectStderr 1 nix copy --to $TEST_ROOT/store0 $outPath | grepQuiet -E 'cannot add path .* because it lacks a signature by a trusted key'
+# but this works:
 (! nix copy --to $TEST_ROOT/store0 $outPath)

 # But succeed if we supply the public keys.

@@ -8,6 +8,10 @@ needLocalStore "The test uses --store always so we would just be bypassing the d
 unshare --mount --map-root-user bash <<EOF
 source common.sh
+
+# Avoid store dir being inside sandbox build-dir
+unset NIX_STORE_DIR
+unset NIX_STATE_DIR

 setLocalStore () {
 export NIX_REMOTE=\$TEST_ROOT/\$1
 mkdir -p \$NIX_REMOTE

@@ -20,14 +24,14 @@ unshare --mount --map-root-user bash <<EOF
 setLocalStore store1
 expectStderr 1 "\${cmd[@]}" | grepQuiet "unable to start build process"

-# Fails with `drop-supplementary-groups`
+# Fails with `require-drop-supplementary-groups`
 # TODO better error
 setLocalStore store2
-NIX_CONFIG='drop-supplementary-groups = true' \
+NIX_CONFIG='require-drop-supplementary-groups = true' \
 expectStderr 1 "\${cmd[@]}" | grepQuiet "unable to start build process"

-# Works without `drop-supplementary-groups`
+# Works without `require-drop-supplementary-groups`
 setLocalStore store3
-NIX_CONFIG='drop-supplementary-groups = false' \
+NIX_CONFIG='require-drop-supplementary-groups = false' \
 "\${cmd[@]}"
 EOF
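Editorial note on the rename above: only the setting name `require-drop-supplementary-groups` is taken from the diff; the test passes it through the `NIX_CONFIG` environment variable rather than a config file. A minimal sketch of the same pattern outside the heredoc, with a placeholder store location and build target:

    # Sketch only: drive the renamed setting via NIX_CONFIG (store path and example.nix are placeholders).
    export NIX_REMOTE="$TEST_ROOT/example-store"   # hypothetical local store
    mkdir -p "$NIX_REMOTE"
    NIX_CONFIG='require-drop-supplementary-groups = false' \
      nix-build --no-out-link example.nix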