Mirror of https://github.com/privatevoid-net/nix-super.git (synced 2024-11-30 01:26:15 +02:00)

Commit e68e8e3cee: Merge branch 'path-info' into ca-drv-exotic

63 changed files with 2079 additions and 401 deletions
.github/PULL_REQUEST_TEMPLATE.md (vendored, new file, 28 lines)

@@ -0,0 +1,28 @@
+# Motivation
+<!-- Briefly explain what the change is about and why it is desirable. -->
+
+# Context
+<!-- Provide context. Reference open issues if available. -->
+
+<!-- Non-trivial change: Briefly outline the implementation strategy. -->
+
+<!-- Invasive change: Discuss alternative designs or approaches you considered. -->
+
+<!-- Large change: Provide instructions to reviewers how to read the diff. -->
+
+# Checklist for maintainers
+
+<!-- Contributors: please leave this as is -->
+Maintainers: tick if completed or explain if not relevant
+
+- [ ] agreed on idea
+- [ ] agreed on implementation strategy
+- [ ] tests, as appropriate
+  - functional tests - `tests/**.sh`
+  - unit tests - `src/*/tests`
+  - integration tests
+- [ ] documentation in the manual
+- [ ] code and comments are self-explanatory
+- [ ] commit message explains why the change was made
+- [ ] new feature or bug fix: updated release notes
.version (2 changed lines)

@@ -1 +1 @@
-2.13.0
+2.14.0
@@ -274,6 +274,12 @@ fi
 PKG_CHECK_MODULES([GTEST], [gtest_main])
 
+
+# Look for rapidcheck.
+# No pkg-config yet, https://github.com/emil-e/rapidcheck/issues/302
+AC_CHECK_HEADERS([rapidcheck/gtest.h], [], [], [#include <gtest/gtest.h>])
+AC_CHECK_LIB([rapidcheck], [])
+
 # Look for nlohmann/json.
 PKG_CHECK_MODULES([NLOHMANN_JSON], [nlohmann_json >= 3.9])
@@ -67,6 +67,7 @@
 - [CLI guideline](contributing/cli-guideline.md)
 - [Release Notes](release-notes/release-notes.md)
 - [Release X.Y (202?-??-??)](release-notes/rl-next.md)
+- [Release 2.13 (2023-01-17)](release-notes/rl-2.13.md)
 - [Release 2.12 (2022-12-06)](release-notes/rl-2.12.md)
 - [Release 2.11 (2022-08-25)](release-notes/rl-2.11.md)
 - [Release 2.10 (2022-07-11)](release-notes/rl-2.10.md)
@@ -66,11 +66,11 @@ The operation `--realise` essentially “builds” the specified store
 paths. Realisation is a somewhat overloaded term:
 
 - If the store path is a *derivation*, realisation ensures that the
-  output paths of the derivation are [valid](../glossary.md) (i.e.,
+  output paths of the derivation are [valid] (i.e.,
   the output path and its closure exist in the file system). This
   can be done in several ways. First, it is possible that the
   outputs are already valid, in which case we are done
-  immediately. Otherwise, there may be [substitutes](../glossary.md)
+  immediately. Otherwise, there may be [substitutes]
   that produce the outputs (e.g., by downloading them). Finally, the
   outputs can be produced by running the build task described
   by the derivation.

@@ -82,6 +82,9 @@ paths. Realisation is a somewhat overloaded term:
   produced through substitutes. If there are no (successful)
   substitutes, realisation fails.
 
+[valid]: ../glossary.md#validity
+[substitutes]: ../glossary.md#substitute
+
 The output path of each derivation is printed on standard output. (For
 non-derivations argument, the argument itself is printed.)
 

@@ -295,8 +298,8 @@ error: cannot delete path `/nix/store/zq0h41l75vlb4z45kzgjjmsjxvcv1qk7-mesa-6.4'
 ## Description
 
-The operation `--query` displays various bits of information about the
-store paths . The queries are described below. At most one query can be
+The operation `--query` displays information about [store path]s.
+The queries are described below. At most one query can be
 specified. The default query is `--outputs`.
 
 The paths *paths* may also be symlinks from outside of the Nix store, to

@@ -316,12 +319,12 @@ symlink.
 ## Queries
 
 - `--outputs`\
-  Prints out the [output paths](../glossary.md) of the store
+  Prints out the [output path]s of the store
   derivations *paths*. These are the paths that will be produced when
   the derivation is built.
 
 - `--requisites`; `-R`\
-  Prints out the [closure](../glossary.md) of the store path *paths*.
+  Prints out the [closure] of the given *paths*.
 
   This query has one option:
 

@@ -338,10 +341,12 @@ symlink.
   derivation and specifying the option `--include-outputs`.
 
 - `--references`\
-  Prints the set of [references](../glossary.md) of the store paths
+  Prints the set of [references]s of the store paths
   *paths*, that is, their immediate dependencies. (For *all*
   dependencies, use `--requisites`.)
 
+  [reference]: ../glossary.md#gloss-reference
+
 - `--referrers`\
   Prints the set of *referrers* of the store paths *paths*, that is,
   the store paths currently existing in the Nix store that refer to

@@ -356,11 +361,13 @@ symlink.
   in the Nix store that are dependent on *paths*.
 
 - `--deriver`; `-d`\
-  Prints the [deriver](../glossary.md) of the store paths *paths*. If
+  Prints the [deriver] of the store paths *paths*. If
   the path has no deriver (e.g., if it is a source file), or if the
   deriver is not known (e.g., in the case of a binary-only
   deployment), the string `unknown-deriver` is printed.
 
+  [deriver]: ../glossary.md#gloss-deriver
+
 - `--graph`\
   Prints the references graph of the store paths *paths* in the format
   of the `dot` tool of AT\&T's [Graphviz
@@ -92,7 +92,8 @@ $ nix develop
 The unit-tests for each Nix library (`libexpr`, `libstore`, etc..) are defined
 under `src/{library_name}/tests` using the
-[googletest](https://google.github.io/googletest/) framework.
+[googletest](https://google.github.io/googletest/) and
+[rapidcheck](https://github.com/emil-e/rapidcheck) frameworks.
 
 You can run the whole testsuite with `make check`, or the tests for a specific component with `make libfoo-tests_RUN`. Finer-grained filtering is also possible using the [--gtest_filter](https://google.github.io/googletest/advanced.html#running-a-subset-of-the-tests) command-line option.
 
@@ -19,6 +19,17 @@
 
   [store derivation]: #gloss-store-derivation
 
+- [realise]{#gloss-realise}, realisation\
+  Ensure a [store path] is [valid][validity].
+
+  This means either running the `builder` executable as specified in the corresponding [derivation] or fetching a pre-built [store object] from a [substituter].
+
+  See [`nix-build`](./command-ref/nix-build.md) and [`nix-store --realise`](./command-ref/nix-store.md#operation---realise).
+
+  See [`nix build`](./command-ref/new-cli/nix3-build.md) (experimental).
+
+  [realise]: #gloss-realise
+
 - [content-addressed derivation]{#gloss-content-addressed-derivation}\
   A derivation which has the
   [`__contentAddressed`](./language/advanced-attributes.md#adv-attr-__contentAddressed)

@@ -101,6 +112,8 @@
   copy store objects it doesn't have. For details, see the
   [`substituters` option](./command-ref/conf-file.md#conf-substituters).
 
+  [substituter]: #gloss-substituter
+
 - [purity]{#gloss-purity}\
   The assumption that equal Nix derivations when run always produce
   the same output. This cannot be guaranteed in general (e.g., a

@@ -149,13 +162,15 @@
   [output path]: #gloss-output-path
 
 - [deriver]{#gloss-deriver}\
-  The deriver of an *output path* is the store
-  derivation that built it.
+  The [store derivation] that produced an [output path].
 
 - [validity]{#gloss-validity}\
-  A store path is considered *valid* if it exists in the file system,
-  is listed in the Nix database as being valid, and if all paths in
-  its closure are also valid.
+  A store path is valid if all [store object]s in its [closure] can be read from the [store].
+
+  For a local store, this means:
+  - The store path leads to an existing [store object] in that [store].
+  - The store path is listed in the Nix database as being valid.
+  - All paths in the store path's [closure] are valid.
 
 - [user environment]{#gloss-user-env}\
   An automatically generated store object that consists of a set of
@@ -191,12 +191,12 @@ This is an incomplete overview of language features, by example.
 <tr>
 <td>
 
-<nixpkgs>
+`<nixpkgs>`
 
 </td>
 <td>
 
-Search path. Value determined by [`$NIX_PATH` environment variable](../command-ref/env-common.md#env-NIX_PATH).
+Search path for Nix files. Value determined by [`$NIX_PATH` environment variable](../command-ref/env-common.md#env-NIX_PATH).
 
 </td>
 </tr>
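As an aside for readers of the manual hunk above, a small illustrative sketch of the search-path syntax (not part of the patch; assumes a `nixpkgs` entry exists in `$NIX_PATH`):

```nix
# Angle-bracket syntax looks up an entry in the Nix search path ($NIX_PATH).
# Here <nixpkgs> resolves to the configured nixpkgs source tree.
let
  pkgs = import <nixpkgs> { };  # evaluate nixpkgs with default configuration
in
  pkgs.lib.version              # e.g. a version string such as "22.11"
```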
@@ -24,7 +24,7 @@
 | [Equality] | *expr* `==` *expr* | none | 11 |
 | Inequality | *expr* `!=` *expr* | none | 11 |
 | Logical conjunction (`AND`) | *bool* `&&` *bool* | left | 12 |
-| Logical disjunction (`OR`) | *bool* `||` *bool* | left | 13 |
+| Logical disjunction (`OR`) | *bool* `\|\|` *bool* | left | 13 |
 | [Logical implication] | *bool* `->` *bool* | none | 14 |
 
 [string]: ./values.md#type-string

@@ -120,12 +120,12 @@ The result is a string.
 
 ## Update
 
-> *attrset1* + *attrset2*
+> *attrset1* // *attrset2*
 
 Update [attribute set] *attrset1* with names and values from *attrset2*.
 
-The returned attribute set will have of all the attributes in *e1* and *e2*.
-If an attribute name is present in both, the attribute value from the former is taken.
+The returned attribute set will have of all the attributes in *attrset1* and *attrset2*.
+If an attribute name is present in both, the attribute value from the latter is taken.
 
 [Update]: #update
 
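To make the corrected wording above concrete, a small illustrative sketch of the update operator (not part of the patch):

```nix
# // merges two attribute sets; on a name clash the value from the
# right-hand operand (attrset2) wins.
{ a = 1; b = 2; } // { b = 3; c = 4; }
# evaluates to { a = 1; b = 3; c = 4; }
```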
@@ -32,13 +32,13 @@ which should print something like:
 Priority: 30
 
 On the client side, you can tell Nix to use your binary cache using
-`--option extra-binary-caches`, e.g.:
+`--substituters`, e.g.:
 
 ```console
-$ nix-env -iA nixpkgs.firefox --option extra-binary-caches http://avalon:8080/
+$ nix-env -iA nixpkgs.firefox --substituters http://avalon:8080/
 ```
 
-The option `extra-binary-caches` tells Nix to use this binary cache in
+The option `substituters` tells Nix to use this binary cache in
 addition to your default caches, such as <https://cache.nixos.org>.
 Thus, for any path in the closure of Firefox, Nix will first check if
 the path is available on the server `avalon` or another binary caches.

@@ -47,4 +47,4 @@ If not, it will fall back to building from source.
 You can also tell Nix to always use your binary cache by adding a line
 to the `nix.conf` configuration file like this:
 
-binary-caches = http://avalon:8080/ https://cache.nixos.org/
+substituters = http://avalon:8080/ https://cache.nixos.org/
doc/manual/src/release-notes/rl-2.13.md (new file, 44 lines)

@@ -0,0 +1,44 @@
+# Release 2.13 (2023-01-17)
+
+* The `repeat` and `enforce-determinism` options have been removed
+  since they had been broken under many circumstances for a long time.
+
+* You can now use [flake references] in the [old command line interface], e.g.
+
+  [flake references]: ../command-ref/new-cli/nix3-flake.md#flake-references
+  [old command line interface]: ../command-ref/main-commands.md
+
+  ```shell-session
+  # nix-build flake:nixpkgs -A hello
+  # nix-build -I nixpkgs=flake:github:NixOS/nixpkgs/nixos-22.05 \
+      '<nixpkgs>' -A hello
+  # NIX_PATH=nixpkgs=flake:nixpkgs nix-build '<nixpkgs>' -A hello
+  ```
+
+* Instead of "antiquotation", the more common term [string interpolation](../language/string-interpolation.md) is now used consistently.
+  Historical release notes were not changed.
+
+* Error traces have been reworked to provide detailed explanations and more
+  accurate error locations. A short excerpt of the trace is now shown by
+  default when an error occurs.
+
+* Allow explicitly selecting outputs in a store derivation installable, just like we can do with other sorts of installables.
+  For example,
+  ```shell-session
+  # nix-build /nix/store/gzaflydcr6sb3567hap9q6srzx8ggdgg-glibc-2.33-78.drv^dev
+  ```
+  now works just as
+  ```shell-session
+  # nix-build glibc^dev
+  ```
+  does already.
+
+* On Linux, `nix develop` now sets the
+  [*personality*](https://man7.org/linux/man-pages/man2/personality.2.html)
+  for the development shell in the same way as the actual build of the
+  derivation. This makes shells for `i686-linux` derivations work
+  correctly on `x86_64-linux`.
+
+* You can now disable the global flake registry by setting the `flake-registry`
+  configuration option to an empty string. The same can be achieved at runtime with
+  `--flake-registry ""`.
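As a side note on the terminology change recorded above, string interpolation in the Nix language looks like this (an illustrative sketch, not part of the release notes):

```nix
# A ${...} expression inside a string is interpolated into the result.
let name = "world"; in "Hello, ${name}!"
# evaluates to "Hello, world!"
```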
@@ -1,23 +1,10 @@
 # Release X.Y (202?-??-??)
 
-* The `repeat` and `enforce-determinism` options have been removed
-  since they had been broken under many circumstances for a long time.
-
-* You can now use [flake references] in the [old command line interface], e.g.
-
-  [flake references]: ../command-ref/new-cli/nix3-flake.md#flake-references
-  [old command line interface]: ../command-ref/main-commands.md
-
-  ```
-  # nix-build flake:nixpkgs -A hello
-  # nix-build -I nixpkgs=flake:github:NixOS/nixpkgs/nixos-22.05 \
-      '<nixpkgs>' -A hello
-  # NIX_PATH=nixpkgs=flake:nixpkgs nix-build '<nixpkgs>' -A hello
-  ```
-
-* Instead of "antiquotation", the more common term [string interpolation](../language/string-interpolation.md) is now used consistently.
-  Historical release notes were not changed.
-
-* Error traces have been reworked to provide detailed explanations and more
-  accurate error locations. A short excerpt of the trace is now shown by
-  default when an error occurs.
+* A new function `builtins.readFileType` is available. It is similar to
+  `builtins.readDir` but acts on a single file or directory.
+
+* The `builtins.readDir` function has been optimized when encountering not-yet-known
+  file types from POSIX's `readdir`. In such cases the type of each file is/was
+  discovered by making multiple syscalls. This change makes these operations
+  lazy such that these lookups will only be performed if the attribute is used.
+  This optimization affects a minority of filesystems and operating systems.
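A minimal sketch of the new builtin described above; assuming (based on the description, not on this diff) that its return values mirror the file-type strings of `builtins.readDir`:

```nix
# builtins.readDir returns an attrset mapping entry names to file types;
# builtins.readFileType asks the same question for a single path.
{
  wholeDir = builtins.readDir ./.;               # e.g. { "flake.nix" = "regular"; src = "directory"; ... }
  oneEntry = builtins.readFileType ./flake.nix;  # e.g. "regular" (illustrative path)
}
```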
flake.nix (12 changed lines)

@@ -82,7 +82,9 @@
 });
 
 configureFlags =
-  lib.optionals stdenv.isLinux [
+  [
+    "CXXFLAGS=-I${lib.getDev rapidcheck}/extras/gtest/include"
+  ] ++ lib.optionals stdenv.isLinux [
     "--with-boost=${boost}/lib"
     "--with-sandbox-shell=${sh}/bin/busybox"
   ]

@@ -116,6 +118,7 @@
   boost
   lowdown-nix
   gtest
+  rapidcheck
 ]
 ++ lib.optionals stdenv.isLinux [libseccomp]
 ++ lib.optional (stdenv.isLinux || stdenv.isDarwin) libsodium

@@ -532,6 +535,12 @@
   mkdir $out
 '';
 
+tests.nixpkgsLibTests =
+  nixpkgs.lib.genAttrs systems (system:
+    import (nixpkgs + "/lib/tests/release.nix")
+      { pkgs = nixpkgsFor.${system}; }
+  );
+
 metrics.nixpkgs = import "${nixpkgs-regression}/pkgs/top-level/metrics.nix" {
   pkgs = nixpkgsFor.x86_64-linux;
   nixpkgs = nixpkgs-regression;

@@ -562,6 +571,7 @@
   binaryTarball = self.hydraJobs.binaryTarball.${system};
   perlBindings = self.hydraJobs.perlBindings.${system};
   installTests = self.hydraJobs.installTests.${system};
+  nixpkgsLibTests = self.hydraJobs.tests.nixpkgsLibTests.${system};
 } // (nixpkgs.lib.optionalAttrs (builtins.elem system linux64BitSystems)) {
   dockerImage = self.hydraJobs.dockerImage.${system};
 });
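The `tests.nixpkgsLibTests` addition uses `nixpkgs.lib.genAttrs` to build one attribute per supported system. A rough, self-contained sketch of that helper's behaviour (the system names here are made up for illustration):

```nix
# genAttrs builds an attribute set from a list of names and a function of the name.
let
  genAttrs = names: f:
    builtins.listToAttrs (map (name: { inherit name; value = f name; }) names);
in
  genAttrs [ "x86_64-linux" "aarch64-darwin" ] (system: "job for ${system}")
# evaluates to:
# { aarch64-darwin = "job for aarch64-darwin"; x86_64-linux = "job for x86_64-linux"; }
```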
@@ -24,12 +24,17 @@ $1
 EOF
 }
 
+escape_systemd_env() {
+  temp_var="${1//\'/\\\'}"
+  echo "${temp_var//\%/%%}"
+}
+
 # Gather all non-empty proxy environment variables into a string
 create_systemd_proxy_env() {
   vars="http_proxy https_proxy ftp_proxy no_proxy HTTP_PROXY HTTPS_PROXY FTP_PROXY NO_PROXY"
   for v in $vars; do
     if [ "x${!v:-}" != "x" ]; then
-      echo "Environment=${v}=${!v}"
+      echo "Environment=${v}=$(escape_systemd_env ${!v})"
     fi
   done
 }
@@ -397,7 +397,7 @@ StringSet NixRepl::completePrefix(const std::string & prefix)
 Expr * e = parseString(expr);
 Value v;
 e->eval(*state, *env, v);
-state->forceAttrs(v, noPos, "nevermind, it is ignored anyway");
+state->forceAttrs(v, noPos, "while evaluating an attrset for the purpose of completion (this error should not be displayed; file an issue?)");
 
 for (auto & i : *v.attrs) {
     std::string_view name = state->symbols[i.name];
@@ -11,7 +11,9 @@
 
 #include <algorithm>
 #include <chrono>
+#include <iostream>
 #include <cstring>
+#include <optional>
 #include <unistd.h>
 #include <sys/time.h>
 #include <sys/resource.h>

@@ -1927,7 +1929,9 @@ void ExprConcatStrings::eval(EvalState & state, Env & env, Value & v)
 /* skip canonization of first path, which would only be not
 canonized in the first place if it's coming from a ./${foo} type
 path */
-auto part = state.coerceToString(i_pos, vTmp, context, false, firstType == nString, !first, "while evaluating a path segment");
+auto part = state.coerceToString(i_pos, vTmp, context,
+    "while evaluating a path segment",
+    false, firstType == nString, !first);
 sSize += part->size();
 s.emplace_back(std::move(part));
 }

@@ -2123,15 +2127,16 @@ std::optional<std::string> EvalState::tryAttrsToString(const PosIdx pos, Value &
 if (i != v.attrs->end()) {
 Value v1;
 callFunction(*i->value, v, v1, pos);
-return coerceToString(pos, v1, context, coerceMore, copyToStore,
-    "while evaluating the result of the `toString` attribute").toOwned();
+return coerceToString(pos, v1, context,
+    "while evaluating the result of the `__toString` attribute",
+    coerceMore, copyToStore).toOwned();
 }
 
 return {};
 }
 
-BackedStringView EvalState::coerceToString(const PosIdx pos, Value & v, PathSet & context,
-    bool coerceMore, bool copyToStore, bool canonicalizePath, std::string_view errorCtx)
+BackedStringView EvalState::coerceToString(const PosIdx pos, Value &v, PathSet &context,
+    std::string_view errorCtx, bool coerceMore, bool copyToStore, bool canonicalizePath)
 {
 forceValue(v, pos);
 

@@ -2154,13 +2159,23 @@ BackedStringView EvalState::coerceToString(const PosIdx pos, Value & v, PathSet
 if (maybeString)
     return std::move(*maybeString);
 auto i = v.attrs->find(sOutPath);
-if (i == v.attrs->end())
-    error("cannot coerce a set to a string", showType(v)).withTrace(pos, errorCtx).debugThrow<TypeError>();
-return coerceToString(pos, *i->value, context, coerceMore, copyToStore, canonicalizePath, errorCtx);
+if (i == v.attrs->end()) {
+    error("cannot coerce %1% to a string", showType(v))
+        .withTrace(pos, errorCtx)
+        .debugThrow<TypeError>();
+}
+return coerceToString(pos, *i->value, context, errorCtx,
+    coerceMore, copyToStore, canonicalizePath);
 }
 
-if (v.type() == nExternal)
-    return v.external->coerceToString(positions[pos], context, coerceMore, copyToStore, errorCtx);
+if (v.type() == nExternal) {
+    try {
+        return v.external->coerceToString(positions[pos], context, coerceMore, copyToStore);
+    } catch (Error & e) {
+        e.addTrace(nullptr, errorCtx);
+        throw;
+    }
+}
 
 if (coerceMore) {
 /* Note that `false' is represented as an empty string for

@@ -2175,8 +2190,9 @@ BackedStringView EvalState::coerceToString(const PosIdx pos, Value & v, PathSet
 std::string result;
 for (auto [n, v2] : enumerate(v.listItems())) {
     try {
-        result += *coerceToString(noPos, *v2, context, coerceMore, copyToStore, canonicalizePath,
-            "while evaluating one element of the list");
+        result += *coerceToString(noPos, *v2, context,
+            "while evaluating one element of the list",
+            coerceMore, copyToStore, canonicalizePath);
     } catch (Error & e) {
         e.addTrace(positions[pos], errorCtx);
         throw;

@@ -2190,7 +2206,9 @@ BackedStringView EvalState::coerceToString(const PosIdx pos, Value & v, PathSet
 }
 }
 
-error("cannot coerce %1% to a string", showType(v)).withTrace(pos, errorCtx).debugThrow<TypeError>();
+error("cannot coerce %1% to a string", showType(v))
+    .withTrace(pos, errorCtx)
+    .debugThrow<TypeError>();
 }
 
 

@@ -2220,7 +2238,7 @@ StorePath EvalState::copyPathToStore(PathSet & context, const Path & path)
 
 Path EvalState::coerceToPath(const PosIdx pos, Value & v, PathSet & context, std::string_view errorCtx)
 {
-auto path = coerceToString(pos, v, context, false, false, true, errorCtx).toOwned();
+auto path = coerceToString(pos, v, context, errorCtx, false, false, true).toOwned();
 if (path == "" || path[0] != '/')
     error("string '%1%' doesn't represent an absolute path", path).withTrace(pos, errorCtx).debugThrow<EvalError>();
 return path;

@@ -2229,7 +2247,7 @@ Path EvalState::coerceToPath(const PosIdx pos, Value & v, PathSet & context, std
 
 StorePath EvalState::coerceToStorePath(const PosIdx pos, Value & v, PathSet & context, std::string_view errorCtx)
 {
-auto path = coerceToString(pos, v, context, false, false, true, errorCtx).toOwned();
+auto path = coerceToString(pos, v, context, errorCtx, false, false, true).toOwned();
 if (auto storePath = store->maybeParseStorePath(path))
     return *storePath;
 error("path '%1%' is not in the Nix store", path).withTrace(pos, errorCtx).debugThrow<EvalError>();

@@ -2433,13 +2451,11 @@ void EvalState::printStats()
 }
 
 
-std::string ExternalValueBase::coerceToString(const Pos & pos, PathSet & context, bool copyMore, bool copyToStore, std::string_view errorCtx) const
+std::string ExternalValueBase::coerceToString(const Pos & pos, PathSet & context, bool copyMore, bool copyToStore) const
 {
-auto e = TypeError({
+throw TypeError({
     .msg = hintfmt("cannot coerce %1% to a string", showType())
 });
-e.addTrace(pos, errorCtx);
-throw e;
 }
 
 
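At the Nix-language level, the coercion paths touched above are observable roughly as follows (a sketch for orientation, not part of the patch):

```nix
# Coercing an attribute set to a string consults __toString (or outPath);
# a plain set with neither fails with "cannot coerce a set to a string",
# now reported with the error context threaded through coerceToString.
{
  viaToString = builtins.toString { __toString = self: "hello"; };  # "hello"
  # builtins.toString { }  would fail: cannot coerce a set to a string
}
```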
@@ -203,6 +203,9 @@ public:
 throw std::move(error);
 }
 
+// This is dangerous, but gets in line with the idea that error creation and
+// throwing should not allocate on the stack of hot functions.
+// as long as errors are immediately thrown, it works.
 ErrorBuilder * errorBuilder;
 
 template<typename... Args>

@@ -375,9 +378,9 @@ public:
 booleans and lists to a string. If `copyToStore' is set,
 referenced paths are copied to the Nix store as a side effect. */
 BackedStringView coerceToString(const PosIdx pos, Value & v, PathSet & context,
+    std::string_view errorCtx,
     bool coerceMore = false, bool copyToStore = true,
-    bool canonicalizePath = true,
-    std::string_view errorCtx = "");
+    bool canonicalizePath = true);
 
 StorePath copyPathToStore(PathSet & context, const Path & path);
 
@@ -264,7 +264,7 @@ static Flake getFlake(
 PathSet emptyContext = {};
 flake.config.settings.emplace(
     state.symbols[setting.name],
-    state.coerceToString(setting.pos, *setting.value, emptyContext, false, true, true, "") .toOwned());
+    state.coerceToString(setting.pos, *setting.value, emptyContext, "", false, true, true) .toOwned());
 }
 else if (setting.value->type() == nInt)
 flake.config.settings.emplace(
@@ -350,26 +350,22 @@ void prim_exec(EvalState & state, const PosIdx pos, Value * * args, Value & v)
 auto elems = args[0]->listElems();
 auto count = args[0]->listSize();
 if (count == 0)
-    state.debugThrowLastTrace(EvalError({
-        .msg = hintfmt("at least one argument to 'exec' required"),
-        .errPos = state.positions[pos]
-    }));
+    state.error("at least one argument to 'exec' required").atPos(pos).debugThrow<EvalError>();
 PathSet context;
-auto program = state.coerceToString(pos, *elems[0], context, false, false,
-    "while evaluating the first element of the argument passed to builtins.exec").toOwned();
+auto program = state.coerceToString(pos, *elems[0], context,
+    "while evaluating the first element of the argument passed to builtins.exec",
+    false, false).toOwned();
 Strings commandArgs;
 for (unsigned int i = 1; i < args[0]->listSize(); ++i) {
-    commandArgs.push_back(state.coerceToString(pos, *elems[i], context, false, false,
-        "while evaluating an element of the argument passed to builtins.exec").toOwned());
+    commandArgs.push_back(
+        state.coerceToString(pos, *elems[i], context,
+            "while evaluating an element of the argument passed to builtins.exec",
+            false, false).toOwned());
 }
 try {
     auto _ = state.realiseContext(context); // FIXME: Handle CA derivations
 } catch (InvalidPathError & e) {
-    state.debugThrowLastTrace(EvalError({
-        .msg = hintfmt("cannot execute '%1%', since path '%2%' is not valid",
-            program, e.path),
-        .errPos = state.positions[pos]
-    }));
+    state.error("cannot execute '%1%', since path '%2%' is not valid", program, e.path).atPos(pos).debugThrow<EvalError>();
 }
 
 auto output = runProgram(program, true, commandArgs);

@@ -598,6 +594,7 @@ struct CompareValues
 state.error("cannot compare %s with %s; values of that type are incomparable", showType(*v1), showType(*v2)).debugThrow<EvalError>();
 }
 } catch (Error & e) {
+    if (!errorCtx.empty())
     e.addTrace(nullptr, errorCtx);
     throw;
 }

@@ -620,15 +617,7 @@ static Bindings::iterator getAttr(
 {
 Bindings::iterator value = attrSet->find(attrSym);
 if (value == attrSet->end()) {
-    throw TypeError({
-        .msg = hintfmt("attribute '%s' missing %s", state.symbols[attrSym], normaltxt(errorCtx)),
-        .errPos = state.positions[attrSet->pos],
-    });
-    // TODO XXX
-    // Adding another trace for the function name to make it clear
-    // which call received wrong arguments.
-    //e.addTrace(state.positions[pos], hintfmt("while invoking '%s'", funcName));
-    //state.debugThrowLastTrace(e);
+    state.error("attribute '%s' missing", state.symbols[attrSym]).withTrace(noPos, errorCtx).debugThrow<TypeError>();
 }
 return value;
 }

@@ -801,8 +790,10 @@ static void prim_addErrorContext(EvalState & state, const PosIdx pos, Value * *
 v = *args[1];
 } catch (Error & e) {
     PathSet context;
-    e.addTrace(nullptr, state.coerceToString(pos, *args[0], context,
-        "while evaluating the error message passed to builtins.addErrorContext").toOwned());
+    auto message = state.coerceToString(pos, *args[0], context,
+        "while evaluating the error message passed to builtins.addErrorContext",
+        false, false).toOwned();
+    e.addTrace(nullptr, message, true);
     throw;
 }
 }
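From the language side, the primop changed above is used like this (an illustrative sketch; the exact trace wording depends on the Nix version):

```nix
# builtins.addErrorContext evaluates its second argument and, if that fails,
# adds the given message to the error trace.
builtins.addErrorContext "while evaluating the example value"
  (throw "something went wrong")
# fails with "something went wrong", with the added context shown in the trace
```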
@@ -1006,6 +997,7 @@ static void prim_second(EvalState & state, const PosIdx pos, Value * * args, Val
 * Derivations
 *************************************************************/
 
+static void derivationStrictInternal(EvalState & state, const std::string & name, Bindings * attrs, Value & v);
+
 /* Construct (as a unobservable side effect) a Nix derivation
 expression that performs the derivation described by the argument

@@ -1016,32 +1008,68 @@ static void prim_second(EvalState & state, const PosIdx pos, Value * * args, Val
 derivation. */
 static void prim_derivationStrict(EvalState & state, const PosIdx pos, Value * * args, Value & v)
 {
-using nlohmann::json;
 state.forceAttrs(*args[0], pos, "while evaluating the argument passed to builtins.derivationStrict");
 
+Bindings * attrs = args[0]->attrs;
+
 /* Figure out the name first (for stack backtraces). */
-Bindings::iterator attr = getAttr(state, state.sName, args[0]->attrs, "in the attrset passed as argument to builtins.derivationStrict");
+Bindings::iterator nameAttr = getAttr(state, state.sName, attrs, "in the attrset passed as argument to builtins.derivationStrict");
 
 std::string drvName;
-const auto posDrvName = attr->pos;
 try {
-    drvName = state.forceStringNoCtx(*attr->value, pos, "while evaluating the `name` attribute passed to builtins.derivationStrict");
+    drvName = state.forceStringNoCtx(*nameAttr->value, pos, "while evaluating the `name` attribute passed to builtins.derivationStrict");
 } catch (Error & e) {
-    e.addTrace(state.positions[posDrvName], "while evaluating the derivation attribute 'name'");
+    e.addTrace(state.positions[nameAttr->pos], "while evaluating the derivation attribute 'name'");
     throw;
 }
 
+try {
+    derivationStrictInternal(state, drvName, attrs, v);
+} catch (Error & e) {
+    Pos pos = state.positions[nameAttr->pos];
+    /*
+     * Here we make two abuses of the error system
+     *
+     * 1. We print the location as a string to avoid a code snippet being
+     * printed. While the location of the name attribute is a good hint, the
+     * exact code there is irrelevant.
+     *
+     * 2. We mark this trace as a frame trace, meaning that we stop printing
+     * less important traces from now on. In particular, this prevents the
+     * display of the automatic "while calling builtins.derivationStrict"
+     * trace, which is of little use for the public we target here.
+     *
+     * Please keep in mind that error reporting is done on a best-effort
+     * basis in nix. There is no accurate location for a derivation, as it
+     * often results from the composition of several functions
+     * (derivationStrict, derivation, mkDerivation, mkPythonModule, etc.)
+     */
+    e.addTrace(nullptr, hintfmt(
+        "while evaluating derivation '%s'\n"
+        " whose name attribute is located at %s",
+        drvName, pos), true);
+    throw;
+}
+}
+
+static void derivationStrictInternal(EvalState & state, const std::string &
+drvName, Bindings * attrs, Value & v)
+{
 /* Check whether attributes should be passed as a JSON file. */
+using nlohmann::json;
 std::optional<json> jsonObject;
-attr = args[0]->attrs->find(state.sStructuredAttrs);
-if (attr != args[0]->attrs->end() && state.forceBool(*attr->value, pos, "while evaluating the `__structuredAttrs` attribute passed to builtins.derivationStrict"))
+auto attr = attrs->find(state.sStructuredAttrs);
+if (attr != attrs->end() &&
+    state.forceBool(*attr->value, noPos,
+        "while evaluating the `__structuredAttrs` "
+        "attribute passed to builtins.derivationStrict"))
     jsonObject = json::object();
 
 /* Check whether null attributes should be ignored. */
 bool ignoreNulls = false;
-attr = args[0]->attrs->find(state.sIgnoreNulls);
-if (attr != args[0]->attrs->end())
-    ignoreNulls = state.forceBool(*attr->value, pos, "while evaluating the `__ignoreNulls` attribute passed to builtins.derivationStrict");
+attr = attrs->find(state.sIgnoreNulls);
+if (attr != attrs->end())
+    ignoreNulls = state.forceBool(*attr->value, noPos, "while evaluating the `__ignoreNulls` attribute " "passed to builtins.derivationStrict");
 
 /* Build the derivation expression by processing the attributes. */
 Derivation drv;

@@ -1058,7 +1086,7 @@ static void prim_derivationStrict(EvalState & state, const PosIdx pos, Value * *
 StringSet outputs;
 outputs.insert("out");
 
-for (auto & i : args[0]->attrs->lexicographicOrder(state.symbols)) {
+for (auto & i : attrs->lexicographicOrder(state.symbols)) {
     if (i->name == state.sIgnoreNulls) continue;
     const std::string & key = state.symbols[i->name];
     vomit("processing attribute '%1%'", key);

@@ -1070,7 +1098,7 @@ static void prim_derivationStrict(EvalState & state, const PosIdx pos, Value * *
 else
     state.debugThrowLastTrace(EvalError({
         .msg = hintfmt("invalid value '%s' for 'outputHashMode' attribute", s),
-        .errPos = state.positions[posDrvName]
+        .errPos = state.positions[noPos]
     }));
 };
 

@@ -1080,7 +1108,7 @@ static void prim_derivationStrict(EvalState & state, const PosIdx pos, Value * *
 if (outputs.find(j) != outputs.end())
     state.debugThrowLastTrace(EvalError({
         .msg = hintfmt("duplicate derivation output '%1%'", j),
-        .errPos = state.positions[posDrvName]
+        .errPos = state.positions[noPos]
     }));
 /* !!! Check whether j is a valid attribute
 name. */

@@ -1090,34 +1118,35 @@ static void prim_derivationStrict(EvalState & state, const PosIdx pos, Value * *
 if (j == "drv")
     state.debugThrowLastTrace(EvalError({
         .msg = hintfmt("invalid derivation output name 'drv'" ),
-        .errPos = state.positions[posDrvName]
+        .errPos = state.positions[noPos]
     }));
 outputs.insert(j);
 }
 if (outputs.empty())
     state.debugThrowLastTrace(EvalError({
         .msg = hintfmt("derivation cannot have an empty set of outputs"),
-        .errPos = state.positions[posDrvName]
+        .errPos = state.positions[noPos]
     }));
 };
 
 try {
+    // This try-catch block adds context for most errors.
+    // Use this empty error context to signify that we defer to it.
+    const std::string_view context_below("");
+
     if (ignoreNulls) {
-        state.forceValue(*i->value, pos);
+        state.forceValue(*i->value, noPos);
         if (i->value->type() == nNull) continue;
     }
 
     if (i->name == state.sContentAddressed) {
-        contentAddressed = state.forceBool(*i->value, pos,
-            "while evaluating the `__contentAddressed` attribute passed to builtins.derivationStrict");
+        contentAddressed = state.forceBool(*i->value, noPos, context_below);
         if (contentAddressed)
             settings.requireExperimentalFeature(Xp::CaDerivations);
     }
 
     else if (i->name == state.sImpure) {
-        isImpure = state.forceBool(*i->value, pos,
-            "while evaluating the 'impure' attribute passed to builtins.derivationStrict");
+        isImpure = state.forceBool(*i->value, noPos, context_below);
         if (isImpure)
             settings.requireExperimentalFeature(Xp::ImpureDerivations);
     }

@@ -1125,11 +1154,11 @@ static void prim_derivationStrict(EvalState & state, const PosIdx pos, Value * *
     /* The `args' attribute is special: it supplies the
     command-line arguments to the builder. */
     else if (i->name == state.sArgs) {
-        state.forceList(*i->value, pos,
-            "while evaluating the `args` attribute passed to builtins.derivationStrict");
+        state.forceList(*i->value, noPos, context_below);
         for (auto elem : i->value->listItems()) {
-            auto s = state.coerceToString(posDrvName, *elem, context, true,
-                "while evaluating an element of the `args` argument passed to builtins.derivationStrict").toOwned();
+            auto s = state.coerceToString(noPos, *elem, context,
+                "while evaluating an element of the argument list",
+                true).toOwned();
             drv.args.push_back(s);
         }
     }

@@ -1142,29 +1171,29 @@ static void prim_derivationStrict(EvalState & state, const PosIdx pos, Value * *
 
     if (i->name == state.sStructuredAttrs) continue;
 
-    (*jsonObject)[key] = printValueAsJSON(state, true, *i->value, pos, context);
+    (*jsonObject)[key] = printValueAsJSON(state, true, *i->value, noPos, context);
 
     if (i->name == state.sBuilder)
-        drv.builder = state.forceString(*i->value, context, posDrvName, "while evaluating the `builder` attribute passed to builtins.derivationStrict");
+        drv.builder = state.forceString(*i->value, context, noPos, context_below);
     else if (i->name == state.sSystem)
-        drv.platform = state.forceStringNoCtx(*i->value, posDrvName, "while evaluating the `system` attribute passed to builtins.derivationStrict");
+        drv.platform = state.forceStringNoCtx(*i->value, noPos, context_below);
     else if (i->name == state.sOutputHash)
-        outputHash = state.forceStringNoCtx(*i->value, posDrvName, "while evaluating the `outputHash` attribute passed to builtins.derivationStrict");
+        outputHash = state.forceStringNoCtx(*i->value, noPos, context_below);
     else if (i->name == state.sOutputHashAlgo)
-        outputHashAlgo = state.forceStringNoCtx(*i->value, posDrvName, "while evaluating the `outputHashAlgo` attribute passed to builtins.derivationStrict");
+        outputHashAlgo = state.forceStringNoCtx(*i->value, noPos, context_below);
     else if (i->name == state.sOutputHashMode)
-        handleHashMode(state.forceStringNoCtx(*i->value, posDrvName, "while evaluating the `outputHashMode` attribute passed to builtins.derivationStrict"));
+        handleHashMode(state.forceStringNoCtx(*i->value, noPos, context_below));
     else if (i->name == state.sOutputs) {
         /* Require ‘outputs’ to be a list of strings. */
-        state.forceList(*i->value, posDrvName, "while evaluating the `outputs` attribute passed to builtins.derivationStrict");
+        state.forceList(*i->value, noPos, context_below);
         Strings ss;
         for (auto elem : i->value->listItems())
-            ss.emplace_back(state.forceStringNoCtx(*elem, posDrvName, "while evaluating an element of the `outputs` attribute passed to builtins.derivationStrict"));
+            ss.emplace_back(state.forceStringNoCtx(*elem, noPos, context_below));
         handleOutputs(ss);
     }
 
 } else {
-    auto s = state.coerceToString(i->pos, *i->value, context, true, "while evaluating an attribute passed to builtins.derivationStrict").toOwned();
+    auto s = state.coerceToString(noPos, *i->value, context, context_below, true).toOwned();
     drv.env.emplace(key, s);
     if (i->name == state.sBuilder) drv.builder = std::move(s);
     else if (i->name == state.sSystem) drv.platform = std::move(s);

@@ -1178,8 +1207,8 @@ static void prim_derivationStrict(EvalState & state, const PosIdx pos, Value * *
 }
 
 } catch (Error & e) {
-    e.addTrace(nullptr,
-        hintfmt("while evaluating the attribute '%1%' of the derivation '%2%'", key, drvName),
+    e.addTrace(state.positions[i->pos],
+        hintfmt("while evaluating attribute '%1%' of derivation '%2%'", key, drvName),
         true);
     throw;
 }
@@ -1224,20 +1253,20 @@ static void prim_derivationStrict(EvalState & state, const PosIdx pos, Value * *
 if (drv.builder == "")
     state.debugThrowLastTrace(EvalError({
         .msg = hintfmt("required attribute 'builder' missing"),
-        .errPos = state.positions[posDrvName]
+        .errPos = state.positions[noPos]
     }));
 
 if (drv.platform == "")
     state.debugThrowLastTrace(EvalError({
         .msg = hintfmt("required attribute 'system' missing"),
-        .errPos = state.positions[posDrvName]
+        .errPos = state.positions[noPos]
     }));
 
 /* Check whether the derivation name is valid. */
 if (isDerivation(drvName) && ingestionMethod != ContentAddressMethod { TextHashMethod { } })
     state.debugThrowLastTrace(EvalError({
         .msg = hintfmt("derivation names are allowed to end in '%s' only if they produce a single derivation file", drvExtension),
-        .errPos = state.positions[posDrvName]
+        .errPos = state.positions[noPos]
     }));
 
 if (outputHash) {
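For reference, the attributes whose absence triggers the errors above correspond to the minimal `derivation` call at the language level (a hedged sketch with placeholder values, not taken from the diff):

```nix
# name, system and builder are the attributes builtins.derivationStrict requires;
# leaving builder or system out yields the "required attribute ... missing" error.
derivation {
  name = "example";                   # placeholder name
  system = builtins.currentSystem;    # e.g. "x86_64-linux"
  builder = "/bin/sh";                # placeholder builder path
  args = [ "-c" "echo hello > $out" ];
}
```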
@@ -1248,7 +1277,7 @@ static void prim_derivationStrict(EvalState & state, const PosIdx pos, Value * *
 if (outputs.size() != 1 || *(outputs.begin()) != "out")
     state.debugThrowLastTrace(Error({
         .msg = hintfmt("multiple outputs are not supported in fixed-output derivations"),
-        .errPos = state.positions[posDrvName]
+        .errPos = state.positions[noPos]
     }));
 
 auto h = newHashAllowEmpty(*outputHash, parseHashTypeOpt(outputHashAlgo));

@@ -1268,7 +1297,7 @@ static void prim_derivationStrict(EvalState & state, const PosIdx pos, Value * *
 if (contentAddressed && isImpure)
     throw EvalError({
         .msg = hintfmt("derivation cannot be both content-addressed and impure"),
-        .errPos = state.positions[posDrvName]
+        .errPos = state.positions[noPos]
     });
 
 auto ht = parseHashTypeOpt(outputHashAlgo).value_or(htSHA256);

@@ -1312,7 +1341,7 @@ static void prim_derivationStrict(EvalState & state, const PosIdx pos, Value * *
 if (!h)
     throw AssertionError({
         .msg = hintfmt("derivation produced no hash for output '%s'", i),
-        .errPos = state.positions[posDrvName],
+        .errPos = state.positions[noPos],
     });
 auto outPath = state.store->makeOutputPath(i, *h, drvName);
 drv.env[i] = state.store->printStorePath(outPath);

@@ -1345,11 +1374,12 @@ static void prim_derivationStrict(EvalState & state, const PosIdx pos, Value * *
 drvHashes.lock()->insert_or_assign(drvPath, h);
 }
 
-auto attrs = state.buildBindings(1 + drv.outputs.size());
-attrs.alloc(state.sDrvPath).mkString(drvPathS, {"=" + drvPathS});
+auto result = state.buildBindings(1 + drv.outputs.size());
+result.alloc(state.sDrvPath).mkString(drvPathS, {"=" + drvPathS});
 for (auto & i : drv.outputs)
-    mkOutputString(state, attrs, drvPath, drv, i);
-v.mkAttrs(attrs);
+    mkOutputString(state, result, drvPath, drv, i);
+
+v.mkAttrs(result);
 }
 
 static RegisterPrimOp primop_derivationStrict(RegisterPrimOp::Info {

@@ -1492,7 +1522,9 @@ static RegisterPrimOp primop_pathExists({
 static void prim_baseNameOf(EvalState & state, const PosIdx pos, Value * * args, Value & v)
 {
 PathSet context;
-v.mkString(baseNameOf(*state.coerceToString(pos, *args[0], context, false, false, "while evaluating the first argument passed to builtins.baseNameOf")), context);
+v.mkString(baseNameOf(*state.coerceToString(pos, *args[0], context,
+    "while evaluating the first argument passed to builtins.baseNameOf",
+    false, false)), context);
 }
 
 static RegisterPrimOp primop_baseNameOf({

@@ -1512,7 +1544,9 @@ static RegisterPrimOp primop_baseNameOf({
 static void prim_dirOf(EvalState & state, const PosIdx pos, Value * * args, Value & v)
 {
 PathSet context;
-auto path = state.coerceToString(pos, *args[0], context, false, false, "while evaluating the first argument passed to builtins.dirOf");
+auto path = state.coerceToString(pos, *args[0], context,
+    "while evaluating the first argument passed to builtins.dirOf",
+    false, false);
 auto dir = dirOf(*path);
 if (args[0]->type() == nPath) v.mkPath(dir); else v.mkString(dir, context);
 }

@@ -1578,8 +1612,9 @@ static void prim_findFile(EvalState & state, const PosIdx pos, Value * * args, V
 i = getAttr(state, state.sPath, v2->attrs, "in an element of the __nixPath");
 
 PathSet context;
-auto path = state.coerceToString(pos, *i->value, context, false, false,
-    "while evaluating the `path` attribute of an element of the list passed to builtins.findFile").toOwned();
+auto path = state.coerceToString(pos, *i->value, context,
+    "while evaluating the `path` attribute of an element of the list passed to builtins.findFile",
|
||||||
|
false, false).toOwned();
|
||||||
|
|
||||||
try {
|
try {
|
||||||
auto rewrites = state.realiseContext(context);
|
auto rewrites = state.realiseContext(context);
|
||||||
|
@ -1632,23 +1667,73 @@ static RegisterPrimOp primop_hashFile({
|
||||||
.fun = prim_hashFile,
|
.fun = prim_hashFile,
|
||||||
});
|
});
|
||||||
|
|
||||||
|
|
||||||
|
/* Stringize a directory entry enum. Used by `readFileType' and `readDir'. */
|
||||||
|
static const char * dirEntTypeToString(unsigned char dtType)
|
||||||
|
{
|
||||||
|
/* Enum DT_(DIR|LNK|REG|UNKNOWN) */
|
||||||
|
switch(dtType) {
|
||||||
|
case DT_REG: return "regular"; break;
|
||||||
|
case DT_DIR: return "directory"; break;
|
||||||
|
case DT_LNK: return "symlink"; break;
|
||||||
|
default: return "unknown"; break;
|
||||||
|
}
|
||||||
|
return "unknown"; /* Unreachable */
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
static void prim_readFileType(EvalState & state, const PosIdx pos, Value * * args, Value & v)
|
||||||
|
{
|
||||||
|
auto path = realisePath(state, pos, *args[0]);
|
||||||
|
/* Retrieve the directory entry type and stringize it. */
|
||||||
|
v.mkString(dirEntTypeToString(getFileType(path)));
|
||||||
|
}
|
||||||
|
|
||||||
|
static RegisterPrimOp primop_readFileType({
|
||||||
|
.name = "__readFileType",
|
||||||
|
.args = {"p"},
|
||||||
|
.doc = R"(
|
||||||
|
Determine the directory entry type of a filesystem node, being
|
||||||
|
one of "directory", "regular", "symlink", or "unknown".
|
||||||
|
)",
|
||||||
|
.fun = prim_readFileType,
|
||||||
|
});
|
||||||
|
|
||||||
/* Read a directory (without . or ..) */
|
/* Read a directory (without . or ..) */
|
||||||
static void prim_readDir(EvalState & state, const PosIdx pos, Value * * args, Value & v)
|
static void prim_readDir(EvalState & state, const PosIdx pos, Value * * args, Value & v)
|
||||||
{
|
{
|
||||||
auto path = realisePath(state, pos, *args[0]);
|
auto path = realisePath(state, pos, *args[0]);
|
||||||
|
|
||||||
|
// Retrieve directory entries for all nodes in a directory.
|
||||||
|
// This is similar to `getFileType` but is optimized to reduce system calls
|
||||||
|
// on many systems.
|
||||||
DirEntries entries = readDirectory(path);
|
DirEntries entries = readDirectory(path);
|
||||||
|
|
||||||
auto attrs = state.buildBindings(entries.size());
|
auto attrs = state.buildBindings(entries.size());
|
||||||
|
|
||||||
|
// If we hit unknown directory entry types we may need to fallback to
|
||||||
|
// using `getFileType` on some systems.
|
||||||
|
// In order to reduce system calls we make each lookup lazy by using
|
||||||
|
// `builtins.readFileType` application.
|
||||||
|
Value * readFileType = nullptr;
|
||||||
|
|
||||||
for (auto & ent : entries) {
|
for (auto & ent : entries) {
|
||||||
if (ent.type == DT_UNKNOWN)
|
auto & attr = attrs.alloc(ent.name);
|
||||||
ent.type = getFileType(path + "/" + ent.name);
|
if (ent.type == DT_UNKNOWN) {
|
||||||
attrs.alloc(ent.name).mkString(
|
// Some filesystems or operating systems may not be able to return
|
||||||
ent.type == DT_REG ? "regular" :
|
// detailed node info quickly in this case we produce a thunk to
|
||||||
ent.type == DT_DIR ? "directory" :
|
// query the file type lazily.
|
||||||
ent.type == DT_LNK ? "symlink" :
|
auto epath = state.allocValue();
|
||||||
"unknown");
|
Path path2 = path + "/" + ent.name;
|
||||||
|
epath->mkString(path2);
|
||||||
|
if (!readFileType)
|
||||||
|
readFileType = &state.getBuiltin("readFileType");
|
||||||
|
attr.mkApp(readFileType, epath);
|
||||||
|
} else {
|
||||||
|
// This branch of the conditional is much more likely.
|
||||||
|
// Here we just stringize the directory entry type.
|
||||||
|
attr.mkString(dirEntTypeToString(ent.type));
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
v.mkAttrs(attrs);
|
v.mkAttrs(attrs);
|
||||||
|
@ -2628,14 +2713,9 @@ static void prim_zipAttrsWith(EvalState & state, const PosIdx pos, Value * * arg
|
||||||
|
|
||||||
for (unsigned int n = 0; n < listSize; ++n) {
|
for (unsigned int n = 0; n < listSize; ++n) {
|
||||||
Value * vElem = listElems[n];
|
Value * vElem = listElems[n];
|
||||||
try {
|
|
||||||
state.forceAttrs(*vElem, noPos, "while evaluating a value of the list passed as second argument to builtins.zipAttrsWith");
|
state.forceAttrs(*vElem, noPos, "while evaluating a value of the list passed as second argument to builtins.zipAttrsWith");
|
||||||
for (auto & attr : *vElem->attrs)
|
for (auto & attr : *vElem->attrs)
|
||||||
attrsSeen[attr.name].first++;
|
attrsSeen[attr.name].first++;
|
||||||
} catch (TypeError & e) {
|
|
||||||
e.addTrace(state.positions[pos], hintfmt("while invoking '%s'", "zipAttrsWith"));
|
|
||||||
state.debugThrowLastTrace(e);
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
auto attrs = state.buildBindings(attrsSeen.size());
|
auto attrs = state.buildBindings(attrsSeen.size());
|
||||||
|
@ -3019,13 +3099,13 @@ static void prim_genList(EvalState & state, const PosIdx pos, Value * * args, Va
|
||||||
auto len = state.forceInt(*args[1], pos, "while evaluating the second argument passed to builtins.genList");
|
auto len = state.forceInt(*args[1], pos, "while evaluating the second argument passed to builtins.genList");
|
||||||
|
|
||||||
if (len < 0)
|
if (len < 0)
|
||||||
state.debugThrowLastTrace(EvalError({
|
state.error("cannot create list of size %1%", len).debugThrow<EvalError>();
|
||||||
.msg = hintfmt("cannot create list of size %1%", len),
|
|
||||||
.errPos = state.positions[pos]
|
// More strict than striclty (!) necessary, but acceptable
|
||||||
}));
|
// as evaluating map without accessing any values makes little sense.
|
||||||
|
state.forceFunction(*args[0], noPos, "while evaluating the first argument passed to builtins.genList");
|
||||||
|
|
||||||
state.mkList(v, len);
|
state.mkList(v, len);
|
||||||
|
|
||||||
for (unsigned int n = 0; n < (unsigned int) len; ++n) {
|
for (unsigned int n = 0; n < (unsigned int) len; ++n) {
|
||||||
auto arg = state.allocValue();
|
auto arg = state.allocValue();
|
||||||
arg->mkInt(n);
|
arg->mkInt(n);
|
||||||
|
@ -3073,6 +3153,8 @@ static void prim_sort(EvalState & state, const PosIdx pos, Value * * args, Value
|
||||||
auto comparator = [&](Value * a, Value * b) {
|
auto comparator = [&](Value * a, Value * b) {
|
||||||
/* Optimization: if the comparator is lessThan, bypass
|
/* Optimization: if the comparator is lessThan, bypass
|
||||||
callFunction. */
|
callFunction. */
|
||||||
|
/* TODO: (layus) this is absurd. An optimisation like this
|
||||||
|
should be outside the lambda creation */
|
||||||
if (args[0]->isPrimOp() && args[0]->primOp->fun == prim_lessThan)
|
if (args[0]->isPrimOp() && args[0]->primOp->fun == prim_lessThan)
|
||||||
return CompareValues(state, noPos, "while evaluating the ordering function passed to builtins.sort")(a, b);
|
return CompareValues(state, noPos, "while evaluating the ordering function passed to builtins.sort")(a, b);
|
||||||
|
|
||||||
|
@ -3233,12 +3315,7 @@ static void prim_concatMap(EvalState & state, const PosIdx pos, Value * * args,
|
||||||
for (unsigned int n = 0; n < nrLists; ++n) {
|
for (unsigned int n = 0; n < nrLists; ++n) {
|
||||||
Value * vElem = args[1]->listElems()[n];
|
Value * vElem = args[1]->listElems()[n];
|
||||||
state.callFunction(*args[0], *vElem, lists[n], pos);
|
state.callFunction(*args[0], *vElem, lists[n], pos);
|
||||||
try {
|
|
||||||
state.forceList(lists[n], lists[n].determinePos(args[0]->determinePos(pos)), "while evaluating the return value of the function passed to buitlins.concatMap");
|
state.forceList(lists[n], lists[n].determinePos(args[0]->determinePos(pos)), "while evaluating the return value of the function passed to buitlins.concatMap");
|
||||||
} catch (TypeError &e) {
|
|
||||||
e.addTrace(state.positions[pos], hintfmt("while invoking '%s'", "concatMap"));
|
|
||||||
state.debugThrowLastTrace(e);
|
|
||||||
}
|
|
||||||
len += lists[n].listSize();
|
len += lists[n].listSize();
|
||||||
}
|
}
|
||||||
|
|
||||||
|
@ -3418,7 +3495,7 @@ static void prim_lessThan(EvalState & state, const PosIdx pos, Value * * args, V
|
||||||
state.forceValue(*args[0], pos);
|
state.forceValue(*args[0], pos);
|
||||||
state.forceValue(*args[1], pos);
|
state.forceValue(*args[1], pos);
|
||||||
// pos is exact here, no need for a message.
|
// pos is exact here, no need for a message.
|
||||||
CompareValues comp(state, pos, "");
|
CompareValues comp(state, noPos, "");
|
||||||
v.mkBool(comp(args[0], args[1]));
|
v.mkBool(comp(args[0], args[1]));
|
||||||
}
|
}
|
||||||
|
|
||||||
|
@ -3445,7 +3522,9 @@ static RegisterPrimOp primop_lessThan({
|
||||||
static void prim_toString(EvalState & state, const PosIdx pos, Value * * args, Value & v)
|
static void prim_toString(EvalState & state, const PosIdx pos, Value * * args, Value & v)
|
||||||
{
|
{
|
||||||
PathSet context;
|
PathSet context;
|
||||||
auto s = state.coerceToString(pos, *args[0], context, true, false, "while evaluating the first argument passed to builtins.toString");
|
auto s = state.coerceToString(pos, *args[0], context,
|
||||||
|
"while evaluating the first argument passed to builtins.toString",
|
||||||
|
true, false);
|
||||||
v.mkString(*s, context);
|
v.mkString(*s, context);
|
||||||
}
|
}
|
||||||
|
|
||||||
|
@ -3797,21 +3876,18 @@ static void prim_replaceStrings(EvalState & state, const PosIdx pos, Value * * a
|
||||||
state.forceList(*args[0], pos, "while evaluating the first argument passed to builtins.replaceStrings");
|
state.forceList(*args[0], pos, "while evaluating the first argument passed to builtins.replaceStrings");
|
||||||
state.forceList(*args[1], pos, "while evaluating the second argument passed to builtins.replaceStrings");
|
state.forceList(*args[1], pos, "while evaluating the second argument passed to builtins.replaceStrings");
|
||||||
if (args[0]->listSize() != args[1]->listSize())
|
if (args[0]->listSize() != args[1]->listSize())
|
||||||
state.debugThrowLastTrace(EvalError({
|
state.error("'from' and 'to' arguments passed to builtins.replaceStrings have different lengths").atPos(pos).debugThrow<EvalError>();
|
||||||
.msg = hintfmt("'from' and 'to' arguments to 'replaceStrings' have different lengths"),
|
|
||||||
.errPos = state.positions[pos]
|
|
||||||
}));
|
|
||||||
|
|
||||||
std::vector<std::string> from;
|
std::vector<std::string> from;
|
||||||
from.reserve(args[0]->listSize());
|
from.reserve(args[0]->listSize());
|
||||||
for (auto elem : args[0]->listItems())
|
for (auto elem : args[0]->listItems())
|
||||||
from.emplace_back(state.forceString(*elem, pos, "while evaluating one of the strings to replace in builtins.replaceStrings"));
|
from.emplace_back(state.forceString(*elem, pos, "while evaluating one of the strings to replace passed to builtins.replaceStrings"));
|
||||||
|
|
||||||
std::vector<std::pair<std::string, PathSet>> to;
|
std::vector<std::pair<std::string, PathSet>> to;
|
||||||
to.reserve(args[1]->listSize());
|
to.reserve(args[1]->listSize());
|
||||||
for (auto elem : args[1]->listItems()) {
|
for (auto elem : args[1]->listItems()) {
|
||||||
PathSet ctx;
|
PathSet ctx;
|
||||||
auto s = state.forceString(*elem, ctx, pos, "while evaluating one of the replacement strings of builtins.replaceStrings");
|
auto s = state.forceString(*elem, ctx, pos, "while evaluating one of the replacement strings passed to builtins.replaceStrings");
|
||||||
to.emplace_back(s, std::move(ctx));
|
to.emplace_back(s, std::move(ctx));
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
|
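The new `readDir` above trusts the `d_type` reported by the directory entries and only reaches for a per-entry file-type lookup (via the lazy `readFileType` application) when the entry type is `DT_UNKNOWN`. As a rough, self-contained illustration of that trade-off outside the evaluator (plain POSIX C++, hypothetical paths, not Nix's own API):

#include <dirent.h>
#include <sys/stat.h>
#include <cstdio>
#include <string>

// Map a directory entry to the same strings readDir/readFileType return,
// paying for an lstat() only when the filesystem left d_type unknown.
static const char * entryType(const std::string & dir, const struct dirent * ent)
{
    unsigned char t = ent->d_type;
    if (t == DT_UNKNOWN) {
        struct stat st;
        if (lstat((dir + "/" + ent->d_name).c_str(), &st) == 0) {
            if (S_ISREG(st.st_mode)) t = DT_REG;
            else if (S_ISDIR(st.st_mode)) t = DT_DIR;
            else if (S_ISLNK(st.st_mode)) t = DT_LNK;
        }
    }
    switch (t) {
        case DT_REG: return "regular";
        case DT_DIR: return "directory";
        case DT_LNK: return "symlink";
        default: return "unknown";
    }
}

int main()
{
    std::string dir = "."; // hypothetical directory to scan
    if (DIR * d = opendir(dir.c_str())) {
        while (struct dirent * ent = readdir(d))
            std::printf("%s: %s\n", ent->d_name, entryType(dir, ent));
        closedir(d);
    }
}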
@@ -83,15 +83,13 @@ static void prim_getContext(EvalState & state, const PosIdx pos, Value * * args,
     state.forceString(*args[0], context, pos, "while evaluating the argument passed to builtins.getContext");
     auto contextInfos = std::map<StorePath, ContextInfo>();
     for (const auto & p : context) {
-        Path drv;
-        std::string output;
         NixStringContextElem ctx = NixStringContextElem::parse(*state.store, p);
         std::visit(overloaded {
             [&](NixStringContextElem::DrvDeep & d) {
                 contextInfos[d.drvPath].allOutputs = true;
             },
             [&](NixStringContextElem::Built & b) {
-                contextInfos[b.drvPath].outputs.emplace_back(std::move(output));
+                contextInfos[b.drvPath].outputs.emplace_back(std::move(b.output));
             },
             [&](NixStringContextElem::Opaque & o) {
                 contextInfos[o.path].path = true;
@@ -22,7 +22,9 @@ static void prim_fetchMercurial(EvalState & state, const PosIdx pos, Value * * a
         for (auto & attr : *args[0]->attrs) {
             std::string_view n(state.symbols[attr.name]);
             if (n == "url")
-                url = state.coerceToString(attr.pos, *attr.value, context, false, false, "while evaluating the `url` attribute passed to builtins.fetchMercurial").toOwned();
+                url = state.coerceToString(attr.pos, *attr.value, context,
+                        "while evaluating the `url` attribute passed to builtins.fetchMercurial",
+                        false, false).toOwned();
             else if (n == "rev") {
                 // Ugly: unlike fetchGit, here the "rev" attribute can
                 // be both a revision or a branch/tag name.
@@ -48,7 +50,9 @@ static void prim_fetchMercurial(EvalState & state, const PosIdx pos, Value * * a
             });

     } else
-        url = state.coerceToString(pos, *args[0], context, false, false, "while evaluating the first argument passed to builtins.fetchMercurial").toOwned();
+        url = state.coerceToString(pos, *args[0], context,
+                "while evaluating the first argument passed to builtins.fetchMercurial",
+                false, false).toOwned();

     // FIXME: git externals probably can be used to bypass the URI
     // whitelist. Ah well.
@@ -125,7 +125,7 @@ static void fetchTree(
             if (attr.name == state.sType) continue;
             state.forceValue(*attr.value, attr.pos);
             if (attr.value->type() == nPath || attr.value->type() == nString) {
-                auto s = state.coerceToString(attr.pos, *attr.value, context, false, false, "").toOwned();
+                auto s = state.coerceToString(attr.pos, *attr.value, context, "", false, false).toOwned();
                 attrs.emplace(state.symbols[attr.name],
                     state.symbols[attr.name] == "url"
                     ? type == "git"
@@ -151,7 +151,9 @@ static void fetchTree(

         input = fetchers::Input::fromAttrs(std::move(attrs));
     } else {
-        auto url = state.coerceToString(pos, *args[0], context, false, false, "while evaluating the first argument passed to the fetcher").toOwned();
+        auto url = state.coerceToString(pos, *args[0], context,
+                "while evaluating the first argument passed to the fetcher",
+                false, false).toOwned();

         if (type == "git") {
             fetchers::Attrs attrs;
@@ -218,6 +220,9 @@ static void fetch(EvalState & state, const PosIdx pos, Value * * args, Value & v
     } else
         url = state.forceStringNoCtx(*args[0], pos, "while evaluating the url we should fetch");

+    if (who == "fetchTarball")
+        url = evalSettings.resolvePseudoUrl(*url);
+
     state.checkURI(*url);

     if (name == "")
File diff suppressed because it is too large
@@ -89,7 +89,7 @@ class ExternalValueBase
     /* Coerce the value to a string. Defaults to uncoercable, i.e. throws an
      * error.
      */
-    virtual std::string coerceToString(const Pos & pos, PathSet & context, bool copyMore, bool copyToStore, std::string_view errorCtx) const;
+    virtual std::string coerceToString(const Pos & pos, PathSet & context, bool copyMore, bool copyToStore) const;

     /* Compare to another value of the same type. Defaults to uncomparable,
      * i.e. always false.
@@ -72,16 +72,14 @@ DownloadFileResult downloadFile(
     auto hash = hashString(htSHA256, res.data);
     ValidPathInfo info {
         *store,
-        {
-            .name = name,
-            .info = FixedOutputInfo {
+        name,
+        FixedOutputInfo {
             {
                 .method = FileIngestionMethod::Flat,
                 .hash = hash,
             },
             .references = {},
         },
-        },
         hashString(htSHA256, sink.s),
     };
     info.narSize = sink.s.size();
@@ -307,9 +307,8 @@ StorePath BinaryCacheStore::addToStoreFromDump(Source & dump, std::string_view n
     return addToStoreCommon(dump, repair, CheckSigs, [&](HashResult nar) {
         ValidPathInfo info {
             *this,
-            {
-                .name = std::string { name },
-                .info = FixedOutputInfo {
+            name,
+            FixedOutputInfo {
                 {
                     .method = method,
                     .hash = nar.first,
@@ -319,7 +318,6 @@ StorePath BinaryCacheStore::addToStoreFromDump(Source & dump, std::string_view n
                     .self = false,
                 },
             },
-            },
             nar.first,
         };
         info.narSize = nar.second;
@@ -427,9 +425,8 @@ StorePath BinaryCacheStore::addToStore(
     return addToStoreCommon(*source, repair, CheckSigs, [&](HashResult nar) {
         ValidPathInfo info {
             *this,
-            {
-                .name = std::string { name },
-                .info = FixedOutputInfo {
+            name,
+            FixedOutputInfo {
                 {
                     .method = method,
                     .hash = h,
@@ -439,7 +436,6 @@ StorePath BinaryCacheStore::addToStore(
                     .self = false,
                 },
             },
-            },
             nar.first,
         };
         info.narSize = nar.second;
@@ -465,13 +461,11 @@ StorePath BinaryCacheStore::addTextToStore(
     return addToStoreCommon(source, repair, CheckSigs, [&](HashResult nar) {
         ValidPathInfo info {
             *this,
-            {
-                .name = std::string { name },
-                .info = TextInfo {
+            std::string { name },
+            TextInfo {
                 { .hash = textHash },
                 references,
             },
-            },
             nar.first,
         };
         info.narSize = nar.second;
@@ -2484,13 +2484,11 @@ DrvOutputs LocalDerivationGoal::registerOutputs()
                 auto got = caSink.finish().first;
                 ValidPathInfo newInfo0 {
                     worker.store,
-                    {
-                        .name = outputPathName(drv->name, outputName),
-                        .info = contentAddressFromMethodHashAndRefs(
+                    outputPathName(drv->name, outputName),
+                    contentAddressFromMethodHashAndRefs(
                         outputHash.method,
                         std::move(got),
                         rewriteRefs()),
-                    },
                     Hash::dummy,
                 };
                 if (*scratchPath != newInfo0.path) {
@@ -95,10 +95,9 @@ void PathSubstitutionGoal::tryNext()
     subs.pop_front();

     if (ca) {
-        subPath = sub->makeFixedOutputPathFromCA({
-            .name = std::string { storePath.name() },
-            .info = caWithoutRefs(*ca),
-        });
+        subPath = sub->makeFixedOutputPathFromCA(
+            std::string { storePath.name() },
+            caWithoutRefs(*ca));
         if (sub->storeDir == worker.store.storeDir)
             assert(subPath == storePath);
     } else if (sub->storeDir != worker.store.storeDir) {
@@ -147,11 +147,4 @@ Hash getContentAddressHash(const ContentAddressWithReferences & ca);

 std::string printMethodAlgo(const ContentAddressWithReferences &);

-struct StorePathDescriptor {
-    std::string name;
-    ContentAddressWithReferences info;
-
-    GENERATE_CMP(StorePathDescriptor, me->name, me->info);
-};
-
 }
@@ -35,10 +35,9 @@ std::optional<StorePath> DerivationOutput::path(const Store & store, std::string

 StorePath DerivationOutput::CAFixed::path(const Store & store, std::string_view drvName, std::string_view outputName) const
 {
-    return store.makeFixedOutputPathFromCA(StorePathDescriptor {
-        .name = outputPathName(drvName, outputName),
-        .info = ca,
-    });
+    return store.makeFixedOutputPathFromCA(
+        outputPathName(drvName, outputName),
+        ca);
 }

@@ -13,6 +13,7 @@

 namespace nix {

+class Store;

 /* Abstract syntax of derivations. */

@@ -1136,10 +1136,9 @@ void LocalStore::querySubstitutablePathInfos(const StorePathCAMap & paths, Subst

         // Recompute store path so that we can use a different store root.
         if (path.second) {
-            subPath = makeFixedOutputPathFromCA({
-                .name = std::string { path.first.name() },
-                .info = caWithoutRefs(*path.second),
-            });
+            subPath = makeFixedOutputPathFromCA(
+                path.first.name(),
+                caWithoutRefs(*path.second));
             if (sub->storeDir == storeDir)
                 assert(subPath == path.first);
             if (subPath != path.first)
@@ -1417,9 +1416,7 @@ StorePath LocalStore::addToStoreFromDump(Source & source0, std::string_view name

     auto [hash, size] = hashSink->finish();

-    auto desc = StorePathDescriptor {
-        std::string { name },
-        FixedOutputInfo {
+    ContentAddressWithReferences desc = FixedOutputInfo {
         {
             .method = method,
             .hash = hash,
@@ -1428,10 +1425,9 @@ StorePath LocalStore::addToStoreFromDump(Source & source0, std::string_view name
             .others = references,
             .self = false,
         },
-        },
     };

-    auto dstPath = makeFixedOutputPathFromCA(desc);
+    auto dstPath = makeFixedOutputPathFromCA(name, desc);

     addTempRoot(dstPath);

@@ -1475,7 +1471,12 @@ StorePath LocalStore::addToStoreFromDump(Source & source0, std::string_view name

         optimisePath(realPath, repair);

-        ValidPathInfo info { *this, std::move(desc), narHash.first };
+        ValidPathInfo info {
+            *this,
+            name,
+            std::move(desc),
+            narHash.first
+        };
         info.narSize = narHash.second;
         registerValidPath(info);
     }
@@ -49,16 +49,14 @@ std::map<StorePath, StorePath> makeContentAddressed(

         ValidPathInfo info {
             dstStore,
-            StorePathDescriptor {
-                .name = std::string { path.name() },
-                .info = FixedOutputInfo {
+            path.name(),
+            FixedOutputInfo {
                 {
                     .method = FileIngestionMethod::Recursive,
                     .hash = narModuloHash,
                 },
                 .references = std::move(refs),
             },
-            },
             Hash::dummy,
         };

@@ -331,7 +331,7 @@ OutputPathMap resolveDerivedPath(Store & store, const DerivedPath::Built & bfd,
         [&](const OutputsSpec::Names & names) {
             return static_cast<std::set<std::string>>(names);
         },
-    }, bfd.outputs);
+    }, bfd.outputs.raw());
     for (auto & output : outputNames) {
         auto outputHash = get(outputHashes, output);
         if (!outputHash)
@@ -16,8 +16,8 @@ struct NarInfo : ValidPathInfo
     uint64_t fileSize = 0;

     NarInfo() = delete;
-    NarInfo(const Store & store, StorePathDescriptor && ca, Hash narHash)
-        : ValidPathInfo(store, std::move(ca), narHash)
+    NarInfo(const Store & store, std::string && name, ContentAddressWithReferences && ca, Hash narHash)
+        : ValidPathInfo(store, std::move(name), std::move(ca), narHash)
     { }
     NarInfo(StorePath && path, Hash narHash) : ValidPathInfo(std::move(path), narHash) { }
     NarInfo(const ValidPathInfo & info) : ValidPathInfo(info) { }
@@ -1,8 +1,10 @@
-#include "util.hh"
-#include "outputs-spec.hh"
-#include "nlohmann/json.hpp"
-
 #include <regex>
+#include <nlohmann/json.hpp>
+
+#include "util.hh"
+#include "regex-combinators.hh"
+#include "outputs-spec.hh"
+#include "path-regex.hh"

 namespace nix {

@@ -18,10 +20,14 @@ bool OutputsSpec::contains(const std::string & outputName) const
     }, raw());
 }

+static std::string outputSpecRegexStr =
+    regex::either(
+        regex::group(R"(\*)"),
+        regex::group(regex::list(nameRegexStr)));
+
 std::optional<OutputsSpec> OutputsSpec::parseOpt(std::string_view s)
 {
-    static std::regex regex(R"((\*)|([a-z]+(,[a-z]+)*))");
+    static std::regex regex(std::string { outputSpecRegexStr });

     std::smatch match;
     std::string s2 { s }; // until some improves std::regex
@@ -42,7 +48,7 @@ OutputsSpec OutputsSpec::parse(std::string_view s)
 {
     std::optional spec = parseOpt(s);
     if (!spec)
-        throw Error("Invalid outputs specifier: '%s'", s);
+        throw Error("invalid outputs specifier '%s'", s);
     return *spec;
 }

@@ -65,7 +71,7 @@ std::pair<std::string_view, ExtendedOutputsSpec> ExtendedOutputsSpec::parse(std:
 {
     std::optional spec = parseOpt(s);
     if (!spec)
-        throw Error("Invalid extended outputs specifier: '%s'", s);
+        throw Error("invalid extended outputs specifier '%s'", s);
     return *spec;
 }

@@ -163,7 +169,7 @@ void adl_serializer<OutputsSpec>::to_json(json & json, OutputsSpec t) {
         [&](const OutputsSpec::Names & names) {
             json = names;
         },
-    }, t);
+    }, t.raw());
 }

@@ -183,7 +189,7 @@ void adl_serializer<ExtendedOutputsSpec>::to_json(json & json, ExtendedOutputsSp
         [&](const ExtendedOutputsSpec::Explicit & e) {
             adl_serializer<OutputsSpec>::to_json(json, e);
         },
-    }, t);
+    }, t.raw());
 }

 }

@@ -21,14 +21,12 @@ void ValidPathInfo::sign(const Store & store, const SecretKey & secretKey)
     sigs.insert(secretKey.signDetached(fingerprint(store)));
 }

-std::optional<StorePathDescriptor> ValidPathInfo::fullStorePathDescriptorOpt() const
+std::optional<ContentAddressWithReferences> ValidPathInfo::contentAddressWithReferenences() const
 {
     if (! ca)
         return std::nullopt;

-    return StorePathDescriptor {
-        .name = std::string { path.name() },
-        .info = std::visit(overloaded {
+    return std::visit(overloaded {
         [&](const TextHash & th) -> ContentAddressWithReferences {
             assert(references.count(path) == 0);
             return TextInfo {
@@ -51,18 +49,17 @@ std::optional<StorePathDescriptor> ValidPathInfo::fullStorePathDescriptorOpt() c
                 },
             };
         },
-        }, *ca),
-    };
+    }, *ca);
 }

 bool ValidPathInfo::isContentAddressed(const Store & store) const
 {
-    auto fullCaOpt = fullStorePathDescriptorOpt();
+    auto fullCaOpt = contentAddressWithReferenences();

     if (! fullCaOpt)
         return false;

-    auto caPath = store.makeFixedOutputPathFromCA(*fullCaOpt);
+    auto caPath = store.makeFixedOutputPathFromCA(path.name(), *fullCaOpt);

     bool res = caPath == path;

@@ -102,9 +99,10 @@ Strings ValidPathInfo::shortRefs() const

 ValidPathInfo::ValidPathInfo(
     const Store & store,
-    StorePathDescriptor && info,
+    std::string_view name,
+    ContentAddressWithReferences && ca,
     Hash narHash)
-    : path(store.makeFixedOutputPathFromCA(info))
+    : path(store.makeFixedOutputPathFromCA(name, ca))
     , narHash(narHash)
 {
     std::visit(overloaded {
@@ -118,7 +116,7 @@ ValidPathInfo::ValidPathInfo(
             this->references.insert(path);
             this->ca = std::move((FixedOutputHash &&) foi);
         },
-    }, std::move(info.info));
+    }, std::move(ca));
 }

@@ -77,7 +77,7 @@ struct ValidPathInfo

     void sign(const Store & store, const SecretKey & secretKey);

-    std::optional<StorePathDescriptor> fullStorePathDescriptorOpt() const;
+    std::optional<ContentAddressWithReferences> contentAddressWithReferenences() const;

     /* Return true iff the path is verifiably content-addressed. */
     bool isContentAddressed(const Store & store) const;
@@ -100,7 +100,7 @@ struct ValidPathInfo
     ValidPathInfo(const StorePath & path, Hash narHash) : path(path), narHash(narHash) { };

     ValidPathInfo(const Store & store,
-        StorePathDescriptor && ca, Hash narHash);
+        std::string_view name, ContentAddressWithReferences && ca, Hash narHash);

     virtual ~ValidPathInfo() { }

src/libstore/path-regex.hh (new file, 7 lines)
@@ -0,0 +1,7 @@
+#pragma once
+
+namespace nix {
+
+static constexpr std::string_view nameRegexStr = R"([0-9a-zA-Z\+\-\._\?=]+)";
+
+}
@@ -8,8 +8,10 @@ static void checkName(std::string_view path, std::string_view name)
 {
     if (name.empty())
         throw BadStorePath("store path '%s' has an empty name", path);
-    if (name.size() > 211)
-        throw BadStorePath("store path '%s' has a name longer than 211 characters", path);
+    if (name.size() > StorePath::MaxPathLen)
+        throw BadStorePath("store path '%s' has a name longer than '%d characters",
+            StorePath::MaxPathLen, path);
+    // See nameRegexStr for the definition
     for (auto c : name)
         if (!((c >= '0' && c <= '9')
             || (c >= 'a' && c <= 'z')
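The name check above enforces two independent rules: a length bound of `StorePath::MaxPathLen` (211) characters and the character set documented by `nameRegexStr`. The following is a self-contained sketch of the same validation in plain C++; it mirrors the logic but is not the Nix implementation itself:

#include <regex>
#include <stdexcept>
#include <string>
#include <iostream>

// Reject names that the store-path parser above would also reject.
static void checkNameSketch(const std::string & name)
{
    static const std::regex nameRegex(R"([0-9a-zA-Z\+\-\._\?=]+)");
    if (name.empty())
        throw std::invalid_argument("empty name");
    if (name.size() > 211) // StorePath::MaxPathLen
        throw std::invalid_argument("name longer than 211 characters");
    if (!std::regex_match(name, nameRegex))
        throw std::invalid_argument("illegal character in name");
}

int main()
{
    for (std::string s : {"hello-2.12", "foo_bar", "bad!name"}) {
        try { checkNameSketch(s); std::cout << s << ": ok\n"; }
        catch (const std::exception & e) { std::cout << s << ": " << e.what() << "\n"; }
    }
}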
@@ -6,7 +6,6 @@

 namespace nix {

-class Store;
 struct Hash;

 class StorePath
@@ -18,6 +17,8 @@ public:
     /* Size of the hash part of store paths, in base-32 characters. */
     constexpr static size_t HashLen = 32; // i.e. 160 bits

+    constexpr static size_t MaxPathLen = 211;
+
     StorePath() = delete;

     StorePath(std::string_view baseName);
@@ -10,6 +10,8 @@

 namespace nix {

+class Store;
+
 struct DrvOutput {
     // The hash modulo of the derivation
     Hash drvHash;
@@ -209,17 +209,17 @@ StorePath Store::makeTextPath(std::string_view name, const TextInfo & info) cons
 }

-StorePath Store::makeFixedOutputPathFromCA(const StorePathDescriptor & desc) const
+StorePath Store::makeFixedOutputPathFromCA(std::string_view name, const ContentAddressWithReferences & ca) const
 {
     // New template
     return std::visit(overloaded {
         [&](const TextInfo & ti) {
-            return makeTextPath(desc.name, ti);
+            return makeTextPath(name, ti);
         },
         [&](const FixedOutputInfo & foi) {
-            return makeFixedOutputPath(desc.name, foi);
+            return makeFixedOutputPath(name, foi);
         }
-    }, desc.info);
+    }, ca);
 }

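Like many call sites in this merge, the function above dispatches on the alternatives of a variant (`TextInfo` vs. `FixedOutputInfo`) using the `overloaded` + `std::visit` idiom. A minimal self-contained sketch of that pattern, with stand-in structs rather than the real Nix types:

#include <variant>
#include <string>
#include <string_view>
#include <iostream>

// Combine several lambdas into a single callable with one operator() per alternative.
template<class... Ts> struct overloaded : Ts... { using Ts::operator()...; };
template<class... Ts> overloaded(Ts...) -> overloaded<Ts...>;

struct TextInfoStub { std::string hash; };
struct FixedOutputInfoStub { std::string method, hash; };
using CaStub = std::variant<TextInfoStub, FixedOutputInfoStub>;

static std::string describe(std::string_view name, const CaStub & ca)
{
    return std::visit(overloaded {
        [&](const TextInfoStub & ti) { return std::string { name } + ": text:" + ti.hash; },
        [&](const FixedOutputInfoStub & foi) { return std::string { name } + ": " + foi.method + ":" + foi.hash; },
    }, ca);
}

int main()
{
    std::cout << describe("hello", FixedOutputInfoStub { "r:sha256", "abc..." }) << "\n";
    std::cout << describe("builder.sh", TextInfoStub { "def..." }) << "\n";
}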
@@ -437,8 +437,7 @@ ValidPathInfo Store::addToStoreSlow(std::string_view name, const Path & srcPath,

     ValidPathInfo info {
         *this,
-        StorePathDescriptor {
-            std::string { name },
+        name,
         FixedOutputInfo {
             {
                 .method = method,
@@ -446,7 +445,6 @@ ValidPathInfo Store::addToStoreSlow(std::string_view name, const Path & srcPath,
             },
             .references = {},
         },
-        },
         narHash,
     };
     info.narSize = narSize;
@@ -997,7 +995,8 @@ void copyStorePath(
     if (info->ca && info->references.empty()) {
         auto info2 = make_ref<ValidPathInfo>(*info);
         info2->path = dstStore.makeFixedOutputPathFromCA(
-            info->fullStorePathDescriptorOpt().value());
+            info->path.name(),
+            info->contentAddressWithReferenences().value());
         if (dstStore.storeDir == srcStore.storeDir)
             assert(info->path == info2->path);
         info = info2;
@@ -1110,7 +1109,8 @@ std::map<StorePath, StorePath> copyPaths(
         auto storePathForDst = storePathForSrc;
         if (currentPathInfo.ca && currentPathInfo.references.empty()) {
             storePathForDst = dstStore.makeFixedOutputPathFromCA(
-                currentPathInfo.fullStorePathDescriptorOpt().value());
+                currentPathInfo.path.name(),
+                currentPathInfo.contentAddressWithReferenences().value());
             if (dstStore.storeDir == srcStore.storeDir)
                 assert(storePathForDst == storePathForSrc);
             if (storePathForDst != storePathForSrc)
@@ -216,7 +216,7 @@ public:

     StorePath makeTextPath(std::string_view name, const TextInfo & info) const;

-    StorePath makeFixedOutputPathFromCA(const StorePathDescriptor & info) const;
+    StorePath makeFixedOutputPathFromCA(std::string_view name, const ContentAddressWithReferences & ca) const;

     /* This is the preparatory part of addToStore(); it computes the
        store path to which srcPath is to be copied. Returns the store

src/libstore/tests/libstoretests.hh (new file, 23 lines)
@@ -0,0 +1,23 @@
+#include <gtest/gtest.h>
+#include <gmock/gmock.h>
+
+#include "store-api.hh"
+
+namespace nix {
+
+class LibStoreTest : public ::testing::Test {
+    public:
+        static void SetUpTestSuite() {
+            initLibStore();
+        }
+
+    protected:
+        LibStoreTest()
+            : store(openStore("dummy://"))
+        { }
+
+        ref<Store> store;
+};
+
+
+} /* namespace nix */
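A hedged sketch of how the fixture above is meant to be consumed: derive a gtest fixture from `LibStoreTest` and exercise the dummy store it opens. The calls used here (`parseStorePath`, `printStorePath`) are the same ones the new path tests use; the test name and the sample path are illustrative only, and the round trip assumes the configured store directory is `/nix/store` as in those tests:

#include <gtest/gtest.h>

#include "store-api.hh"
#include "libstoretests.hh"

namespace nix {

// Reuse the shared fixture: initLibStore() ran once, and `store` is a dummy:// store.
class DummyStoreTest : public LibStoreTest { };

TEST_F(DummyStoreTest, roundTripsAStorePath)
{
    auto p = store->parseStorePath("/nix/store/g1w7hy3qg1w7hy3qg1w7hy3qg1w7hy3q-hello-2.12");
    EXPECT_EQ(p, store->parseStorePath(store->printStorePath(p)));
}

}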
@@ -12,4 +12,4 @@ libstore-tests_CXXFLAGS += -I src/libstore -I src/libutil

 libstore-tests_LIBS = libstore libutil

-libstore-tests_LDFLAGS := $(GTEST_LIBS)
+libstore-tests_LDFLAGS := -lrapidcheck $(GTEST_LIBS)
@@ -40,6 +40,20 @@ TEST(OutputsSpec, names_out) {
     ASSERT_EQ(expected.to_string(), str);
 }

+TEST(OutputsSpec, names_underscore) {
+    std::string_view str = "a_b";
+    OutputsSpec expected = OutputsSpec::Names { "a_b" };
+    ASSERT_EQ(OutputsSpec::parse(str), expected);
+    ASSERT_EQ(expected.to_string(), str);
+}
+
+TEST(OutputsSpec, names_numberic) {
+    std::string_view str = "01";
+    OutputsSpec expected = OutputsSpec::Names { "01" };
+    ASSERT_EQ(OutputsSpec::parse(str), expected);
+    ASSERT_EQ(expected.to_string(), str);
+}
+
 TEST(OutputsSpec, names_out_bin) {
     OutputsSpec expected = OutputsSpec::Names { "out", "bin" };
     ASSERT_EQ(OutputsSpec::parse("out,bin"), expected);

src/libstore/tests/path.cc (new file, 144 lines)
@@ -0,0 +1,144 @@
+#include <regex>
+
+#include <nlohmann/json.hpp>
+#include <gtest/gtest.h>
+#include <rapidcheck/gtest.h>
+
+#include "path-regex.hh"
+#include "store-api.hh"
+
+#include "libstoretests.hh"
+
+namespace nix {
+
+#define STORE_DIR "/nix/store/"
+#define HASH_PART "g1w7hy3qg1w7hy3qg1w7hy3qg1w7hy3q"
+
+class StorePathTest : public LibStoreTest
+{
+};
+
+static std::regex nameRegex { std::string { nameRegexStr } };
+
+#define TEST_DONT_PARSE(NAME, STR)                       \
+    TEST_F(StorePathTest, bad_ ## NAME) {                \
+        std::string_view str =                           \
+            STORE_DIR HASH_PART "-" STR;                 \
+        ASSERT_THROW(                                    \
+            store->parseStorePath(str),                  \
+            BadStorePath);                               \
+        std::string name { STR };                        \
+        EXPECT_FALSE(std::regex_match(name, nameRegex)); \
+    }
+
+TEST_DONT_PARSE(empty, "")
+TEST_DONT_PARSE(garbage, "&*()")
+TEST_DONT_PARSE(double_star, "**")
+TEST_DONT_PARSE(star_first, "*,foo")
+TEST_DONT_PARSE(star_second, "foo,*")
+TEST_DONT_PARSE(bang, "foo!o")
+
+#undef TEST_DONT_PARSE
+
+#define TEST_DO_PARSE(NAME, STR)                         \
+    TEST_F(StorePathTest, good_ ## NAME) {               \
+        std::string_view str =                           \
+            STORE_DIR HASH_PART "-" STR;                 \
+        auto p = store->parseStorePath(str);             \
+        std::string name { p.name() };                   \
+        EXPECT_TRUE(std::regex_match(name, nameRegex));  \
+    }
+
+// 0-9 a-z A-Z + - . _ ? =
+
+TEST_DO_PARSE(numbers, "02345")
+TEST_DO_PARSE(lower_case, "foo")
+TEST_DO_PARSE(upper_case, "FOO")
+TEST_DO_PARSE(plus, "foo+bar")
+TEST_DO_PARSE(dash, "foo-dev")
+TEST_DO_PARSE(underscore, "foo_bar")
+TEST_DO_PARSE(period, "foo.txt")
+TEST_DO_PARSE(question_mark, "foo?why")
+TEST_DO_PARSE(equals_sign, "foo=foo")
+
+#undef TEST_DO_PARSE
+
+// For rapidcheck
+void showValue(const StorePath & p, std::ostream & os) {
+    os << p.to_string();
+}
+
+}
+
+namespace rc {
+using namespace nix;
+
+template<>
+struct Arbitrary<StorePath> {
+    static Gen<StorePath> arbitrary();
+};
+
+Gen<StorePath> Arbitrary<StorePath>::arbitrary()
+{
+    auto len = *gen::inRange<size_t>(1, StorePath::MaxPathLen);
+
+    std::string pre { HASH_PART "-" };
+    pre.reserve(pre.size() + len);
+
+    for (size_t c = 0; c < len; ++c) {
+        switch (auto i = *gen::inRange<uint8_t>(0, 10 + 2 * 26 + 6)) {
+        case 0 ... 9:
+            pre += '0' + i;
+        case 10 ... 35:
+            pre += 'A' + (i - 10);
+            break;
+        case 36 ... 61:
+            pre += 'a' + (i - 36);
+            break;
+        case 62:
+            pre += '+';
+            break;
+        case 63:
+            pre += '-';
+            break;
+        case 64:
+            pre += '.';
+            break;
+        case 65:
+            pre += '_';
+            break;
+        case 66:
+            pre += '?';
+            break;
+        case 67:
+            pre += '=';
+            break;
+        default:
+            assert(false);
+        }
+    }
+
+    return gen::just(StorePath { pre });
+}
+
+} // namespace rc
+
+namespace nix {
+
+RC_GTEST_FIXTURE_PROP(
+    StorePathTest,
+    prop_regex_accept,
+    (const StorePath & p))
+{
+    RC_ASSERT(std::regex_match(std::string { p.name() }, nameRegex));
+}
+
+RC_GTEST_FIXTURE_PROP(
+    StorePathTest,
+    prop_round_rip,
+    (const StorePath & p))
+{
+    RC_ASSERT(p == store->parseStorePath(store->printStorePath(p)));
+}
+
+}
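The rapidcheck `Arbitrary<StorePath>` instance above builds a random name, character by character, from the alphabet allowed by `nameRegexStr`. For readers without rapidcheck at hand, the following plain-C++ sketch does the same thing with the standard library; the fixed hash part is the placeholder used throughout these tests:

#include <random>
#include <string>
#include <iostream>

int main()
{
    // 0-9, A-Z, a-z, and the six punctuation characters permitted in store path names.
    static const std::string alphabet =
        "0123456789"
        "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
        "abcdefghijklmnopqrstuvwxyz"
        "+-._?=";
    std::mt19937 rng(std::random_device{}());
    std::uniform_int_distribution<size_t> len(1, 211);               // up to StorePath::MaxPathLen
    std::uniform_int_distribution<size_t> pick(0, alphabet.size() - 1);

    std::string name;
    for (size_t i = 0, n = len(rng); i < n; ++i)
        name += alphabet[pick(rng)];
    std::cout << "g1w7hy3qg1w7hy3qg1w7hy3qg1w7hy3q-" << name << "\n";
}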
@@ -25,11 +25,17 @@ public:
             /* Wait indefinitely until a POLLHUP occurs. */
             struct pollfd fds[1];
             fds[0].fd = fd;
-            /* This shouldn't be necessary, but macOS doesn't seem to
-               like a zeroed out events field.
-               See rdar://37537852.
-            */
-            fds[0].events = POLLHUP;
+            /* Polling for no specific events (i.e. just waiting
+               for an error/hangup) doesn't work on macOS
+               anymore. So wait for read events and ignore
+               them. */
+            fds[0].events =
+                #ifdef __APPLE__
+                POLLRDNORM
+                #else
+                0
+                #endif
+                ;
             auto count = poll(fds, 1, -1);
             if (count == -1) abort(); // can't happen
             /* This shouldn't happen, but can on macOS due to a bug.
@@ -40,10 +46,15 @@ public:
                    too harmful.
                 */
                 if (count == 0) continue;
-                assert(fds[0].revents & POLLHUP);
+                if (fds[0].revents & POLLHUP) {
                     triggerInterrupt();
                     break;
                 }
+                /* This will only happen on macOS. We sleep a bit to
+                   avoid waking up too often if the client is sending
+                   input. */
+                sleep(1);
+            }
         });
     };
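The comments above explain the macOS quirk: a zeroed `events` mask no longer reports hang-ups reliably, so the monitor asks for `POLLRDNORM` there and ignores the resulting read-readiness wakeups. A self-contained sketch of that pattern, using descriptor 0 purely as a placeholder:

#include <poll.h>
#include <unistd.h>
#include <cstdio>

int main()
{
    struct pollfd fds[1];
    fds[0].fd = 0; // placeholder: any fd whose hangup you want to observe
    fds[0].events =
#ifdef __APPLE__
        POLLRDNORM  // macOS: request read events so the poll actually wakes up
#else
        0           // elsewhere: POLLHUP/POLLERR are reported regardless of events
#endif
        ;
    while (true) {
        if (poll(fds, 1, -1) == -1) return 1;
        if (fds[0].revents & POLLHUP) {
            std::printf("peer hung up\n");
            break;
        }
        // Spurious or read-readiness wakeup (macOS): back off briefly and retry.
        sleep(1);
    }
}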
src/libutil/regex-combinators.hh (new file, 30 lines)
@@ -0,0 +1,30 @@
+#pragma once
+
+#include <string_view>
+
+namespace nix::regex {
+
+// TODO use constexpr string building like
+// https://github.com/akrzemi1/static_string/blob/master/include/ak_toolkit/static_string.hpp
+
+static inline std::string either(std::string_view a, std::string_view b)
+{
+    return std::string { a } + "|" + b;
+}
+
+static inline std::string group(std::string_view a)
+{
+    return std::string { "(" } + a + ")";
+}
+
+static inline std::string many(std::string_view a)
+{
+    return std::string { "(?:" } + a + ")*";
+}
+
+static inline std::string list(std::string_view a)
+{
+    return std::string { a } + many(group("," + a));
+}
+
+}
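The combinators above are small string builders: `either` joins with `|`, `group` wraps in capturing parentheses, `many` wraps in a non-capturing `(?:...)*`, and `list` produces "one item, then zero or more comma-separated items". A hedged, self-contained usage sketch (the helpers are re-declared locally with `std::string` parameters for brevity, so this compiles on its own):

#include <regex>
#include <string>
#include <iostream>

static std::string either(const std::string & a, const std::string & b) { return a + "|" + b; }
static std::string group(const std::string & a) { return "(" + a + ")"; }
static std::string many(const std::string & a) { return "(?:" + a + ")*"; }
static std::string list(const std::string & a) { return a + many(group("," + a)); }

int main()
{
    // Same shape as outputSpecRegexStr: a literal "*" or a comma-separated list of names.
    std::string name = R"([0-9a-zA-Z\+\-\._\?=]+)";
    std::regex spec(either(group(R"(\*)"), group(list(name))));

    for (std::string s : {"*", "out", "out,bin,dev", "a_b", "01", "out,", "foo$"})
        std::cout << s << " -> " << (std::regex_match(s, spec) ? "match" : "no match") << "\n";
}

Note how the non-capturing group in `many` keeps the repeated comma-items from adding capture groups, so callers can still rely on the two top-level groups ("*" vs. the name list).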
@@ -43,16 +43,14 @@ struct CmdAddToStore : MixDryRun, StoreCommand

         ValidPathInfo info {
             *store,
-            StorePathDescriptor {
-                .name = *namePart,
-                .info = FixedOutputInfo {
+            std::move(*namePart),
+            FixedOutputInfo {
                 {
                     .method = std::move(ingestionMethod),
                     .hash = std::move(hash),
                 },
                 .references = {},
             },
-            },
             narHash,
         };
         info.narSize = sink.s.size();
@@ -655,6 +655,19 @@ struct CmdFlakeCheck : FlakeCommand
                     }
                 }

+                else if (
+                    name == "lib"
+                    || name == "darwinConfigurations"
+                    || name == "darwinModules"
+                    || name == "flakeModule"
+                    || name == "flakeModules"
+                    || name == "herculesCI"
+                    || name == "homeConfigurations"
+                    || name == "nixopsConfigurations"
+                    )
+                    // Known but unchecked community attribute
+                    ;
+
                 else
                     warn("unknown flake output '%s'", name);

@@ -200,7 +200,6 @@ struct ProfileManifest

         ValidPathInfo info {
             *store,
-            StorePathDescriptor {
             "profile",
             FixedOutputInfo {
                 {
@@ -212,7 +211,6 @@ struct ProfileManifest
                 .self = false,
                 },
             },
-            },
             narHash,
         };
         info.narSize = sink.s.size();
@@ -42,20 +42,21 @@ nix build -f multiple-outputs.nix --json 'a^*' --no-link | jq --exit-status '
 nix build -f multiple-outputs.nix --json e --no-link | jq --exit-status '
   (.[0] |
     (.drvPath | match(".*multiple-outputs-e.drv")) and
-    (.outputs | keys == ["a", "b"]))
+    (.outputs | keys == ["a_a", "b"]))
 '

 # But not when it's overriden.
-nix build -f multiple-outputs.nix --json e^a --no-link | jq --exit-status '
+nix build -f multiple-outputs.nix --json e^a_a --no-link
+nix build -f multiple-outputs.nix --json e^a_a --no-link | jq --exit-status '
   (.[0] |
     (.drvPath | match(".*multiple-outputs-e.drv")) and
-    (.outputs | keys == ["a"]))
+    (.outputs | keys == ["a_a"]))
 '

 nix build -f multiple-outputs.nix --json 'e^*' --no-link | jq --exit-status '
   (.[0] |
     (.drvPath | match(".*multiple-outputs-e.drv")) and
-    (.outputs | keys == ["a", "b", "c"]))
+    (.outputs | keys == ["a_a", "b", "c"]))
 '

 # Test building from raw store path to drv not expression.
@@ -104,7 +105,7 @@ nix build "$drv^*" --no-link --json | jq --exit-status '
 nix build --impure -f multiple-outputs.nix --json e --no-link | jq --exit-status '
   (.[0] |
     (.drvPath | match(".*multiple-outputs-e.drv")) and
-    (.outputs | keys == ["a", "b"]))
+    (.outputs | keys == ["a_a", "b"]))
 '

 testNormalization () {
@@ -1 +1 @@
-true
+[ true true true true true true ]
@@ -18,7 +18,24 @@ let
     };
   };

-  legit-context = builtins.getContext "${path}${drv.outPath}${drv.foo.outPath}${drv.drvPath}";
+  combo-path = "${path}${drv.outPath}${drv.foo.outPath}${drv.drvPath}";
+  legit-context = builtins.getContext combo-path;

-  constructed-context = builtins.getContext (builtins.appendContext "" desired-context);
-in legit-context == constructed-context
+  reconstructed-path = builtins.appendContext
+    (builtins.unsafeDiscardStringContext combo-path)
+    desired-context;
+
+  # Eta rule for strings with context.
+  etaRule = str:
+    str == builtins.appendContext
+      (builtins.unsafeDiscardStringContext str)
+      (builtins.getContext str);
+
+in [
+  (legit-context == desired-context)
+  (reconstructed-path == combo-path)
+  (etaRule "foo")
+  (etaRule drv.drvPath)
+  (etaRule drv.foo.outPath)
+  (etaRule (builtins.unsafeDiscardOutputDependency drv.drvPath))
+]
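The "eta rule" tested above says that taking a string with context apart into its text and its context, then reassembling the two, yields an equal string. A minimal sketch of the same round trip (the toFile call is only there to produce a string that actually carries context):

    let
      s = "${builtins.toFile "example" "hello"}";
      roundTrip = builtins.appendContext
        (builtins.unsafeDiscardStringContext s)
        (builtins.getContext s);
    in roundTrip == s   # evaluates to true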
@@ -1 +1 @@
-{ bar = "regular"; foo = "directory"; }
+{ bar = "regular"; foo = "directory"; ldir = "symlink"; linked = "symlink"; }
1
tests/lang/eval-okay-readFileType.exp
Normal file
1
tests/lang/eval-okay-readFileType.exp
Normal file
|
@ -0,0 +1 @@
|
||||||
|
{ bar = "regular"; foo = "directory"; ldir = "symlink"; linked = "symlink"; }
|
6
tests/lang/eval-okay-readFileType.nix
Normal file
6
tests/lang/eval-okay-readFileType.nix
Normal file
|
@ -0,0 +1,6 @@
|
||||||
|
{
|
||||||
|
bar = builtins.readFileType ./readDir/bar;
|
||||||
|
foo = builtins.readFileType ./readDir/foo;
|
||||||
|
linked = builtins.readFileType ./readDir/linked;
|
||||||
|
ldir = builtins.readFileType ./readDir/ldir;
|
||||||
|
}
|
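The new `builtins.readFileType` exercised by this test returns the type of a single path ("regular", "directory", "symlink", or "unknown", mirroring the values reported by `builtins.readDir`) without listing the whole parent directory. A small sketch, with paths chosen for illustration rather than taken from the test suite:

    {
      here = builtins.readFileType ./.;           # "directory"
      this = builtins.readFileType ./flake.nix;   # "regular", assuming such a file exists
    }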
1  tests/lang/readDir/ldir  Symbolic link

@@ -0,0 +1 @@
+foo

1  tests/lang/readDir/linked  Symbolic link

@@ -0,0 +1 @@
+foo/git-hates-directories
@@ -91,9 +91,9 @@ rec {

   e = mkDerivation {
     name = "multiple-outputs-e";
-    outputs = [ "a" "b" "c" ];
-    meta.outputsToInstall = [ "a" "b" ];
-    buildCommand = "mkdir $a $b $c";
+    outputs = [ "a_a" "b" "c" ];
+    meta.outputsToInstall = [ "a_a" "b" ];
+    buildCommand = "mkdir $a_a $b $c";
   };

   independent = mkDerivation {
@@ -117,4 +117,14 @@ rec {
     '';
   };

+  invalid-output-name-1 = mkDerivation {
+    name = "invalid-output-name-1";
+    outputs = [ "out/"];
+  };
+
+  invalid-output-name-2 = mkDerivation {
+    name = "invalid-output-name-2";
+    outputs = [ "x" "foo$"];
+  };
+
 }
@@ -83,3 +83,6 @@ nix-store --gc --keep-derivations --keep-outputs
 nix-store --gc --print-roots
 rm -rf $NIX_STORE_DIR/.links
 rmdir $NIX_STORE_DIR
+
+nix build -f multiple-outputs.nix invalid-output-name-1 2>&1 | grep 'contains illegal character'
+nix build -f multiple-outputs.nix invalid-output-name-2 2>&1 | grep 'contains illegal character'
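The two greps above expect the new `invalid-output-name-*` derivations to be rejected, because output names use the same character set as store path names: letters, digits, and a few punctuation characters such as `_`, `-`, and `.` are accepted, while `/` and `$` are not. A standalone sketch (builder and system values are illustrative only, not part of the diff):

    derivation {
      name = "output-name-demo";
      system = builtins.currentSystem;
      builder = "/bin/sh";
      args = [ "-c" "mkdir -p $out $a_a" ];
      # Underscores are fine, as the renamed "a_a" output above demonstrates.
      outputs = [ "out" "a_a" ];
      # outputs = [ "out/" ];   # would be rejected: the name contains an illegal character
    }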