diff --git a/doc/manual/src/language/operators.md b/doc/manual/src/language/operators.md
index e9cbb5c92..6fd66864b 100644
--- a/doc/manual/src/language/operators.md
+++ b/doc/manual/src/language/operators.md
@@ -84,7 +84,7 @@ The `+` operator is overloaded to also work on strings and paths.
>
> *string* `+` *string*
-Concatenate two [string]s and merge their string contexts.
+Concatenate two [strings][string] and merge their string contexts.
[String concatenation]: #string-concatenation
@@ -94,7 +94,7 @@ Concatenate two [string]s and merge their string contexts.
>
> *path* `+` *path*
-Concatenate two [path]s.
+Concatenate two [paths][path].
The result is a path.
[Path concatenation]: #path-concatenation
@@ -150,9 +150,9 @@ If an attribute name is present in both, the attribute value from the latter is
Comparison is
-- [arithmetic] for [number]s
-- lexicographic for [string]s and [path]s
-- item-wise lexicographic for [list]s:
+- [arithmetic] for [numbers][number]
+- lexicographic for [strings][string] and [paths][path]
+- item-wise lexicographic for [lists][list]:
elements at the same index in both lists are compared according to their type and skipped if they are equal.
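+
+  For example, the following list (a small illustration of item-wise comparison) evaluates to `[ true true ]`:
+
+  ```nix
+  [
+    ([ 1 2 ] < [ 1 3 ])    # first elements are equal, so 2 < 3 decides
+    ([ 1 2 ] < [ 1 2 3 ])  # equal up to the shorter length, so the shorter list is less
+  ]
+  ```
+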
All comparison operators are implemented in terms of `<`, and the following equivalencies hold:
@@ -163,12 +163,12 @@ All comparison operators are implemented in terms of `<`, and the following equi
| *a* `>` *b* | *b* `<` *a* |
| *a* `>=` *b* | `! (` *a* `<` *b* `)` |
-[Comparison]: #comparison-operators
+[Comparison]: #comparison
## Equality
-- [Attribute sets][attribute set] and [list]s are compared recursively, and therefore are fully evaluated.
-- Comparison of [function]s always returns `false`.
+- [Attribute sets][attribute set] and [lists][list] are compared recursively, and therefore are fully evaluated.
+- Comparison of [functions][function] always returns `false`.
- Numbers are type-compatible, see [arithmetic] operators.
- Floating point numbers only differ up to a limited precision.
diff --git a/doc/manual/src/language/string-interpolation.md b/doc/manual/src/language/string-interpolation.md
index e999b287b..7d81c2020 100644
--- a/doc/manual/src/language/string-interpolation.md
+++ b/doc/manual/src/language/string-interpolation.md
@@ -20,6 +20,8 @@ Rather than writing
(where `freetype` is a [derivation]), you can instead write
+[derivation]: ../glossary.md#gloss-derivation
+
```nix
"--with-freetype2-library=${freetype}/lib"
```
@@ -189,7 +191,7 @@ If neither is present, an error is thrown.
> "${a}"
> ```
>
-> error: cannot coerce a set to a string
+> error: cannot coerce a set to a string: { }
>
> at «string»:4:2:
>
diff --git a/doc/manual/src/language/values.md b/doc/manual/src/language/values.md
index aea68a441..74ffc7070 100644
--- a/doc/manual/src/language/values.md
+++ b/doc/manual/src/language/values.md
@@ -156,6 +156,8 @@ function and the fifth being a set.
Note that lists are only lazy in values, and they are strict in length.
+Elements in a list can be accessed using [`builtins.elemAt`](./builtins.md#builtins-elemAt).
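+
+For example, indices start at `0`:
+
+```nix
+builtins.elemAt [ 10 20 30 ] 1
+# evaluates to 20
+```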
+
## Attribute Set
An attribute set is a collection of name-value-pairs (called *attributes*) enclosed in curly brackets (`{ }`).
diff --git a/doc/manual/src/protocols/json/derivation.md b/doc/manual/src/protocols/json/derivation.md
new file mode 100644
index 000000000..649d543cc
--- /dev/null
+++ b/doc/manual/src/protocols/json/derivation.md
@@ -0,0 +1,71 @@
+# Derivation JSON Format
+
+> **Warning**
+>
+> This JSON format is currently
+> [**experimental**](@docroot@/contributing/experimental-features.md#xp-feature-nix-command)
+> and subject to change.
+
+The JSON serialization of a
+[derivation](@docroot@/glossary.md#gloss-store-derivation)
+is a JSON object with the following fields:
+
+* `name`:
+ The name of the derivation.
+ This is used when calculating the store paths of the derivation's outputs.
+
+* `outputs`:
+ Information about the output paths of the derivation.
+ This is a JSON object with one member per output, where the key is the output name and the value is a JSON object with these fields:
+
+ * `path`: The output path.
+
+ * `hashAlgo`:
+ For fixed-output derivations, the hashing algorithm (e.g. `sha256`), optionally prefixed by `r:` if `hash` denotes a NAR hash rather than a flat file hash.
+
+ * `hash`:
+ For fixed-output derivations, the expected content hash in base-16.
+
+ > **Example**
+ >
+ > ```json
+ > "outputs": {
+ > "out": {
+ > "path": "/nix/store/2543j7c6jn75blc3drf4g5vhb1rhdq29-source",
+ > "hashAlgo": "r:sha256",
+ > "hash": "6fc80dcc62179dbc12fc0b5881275898f93444833d21b89dfe5f7fbcbb1d0d62"
+ > }
+ > }
+ > ```
+
+* `inputSrcs`:
+ A list of store paths on which this derivation depends.
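+
+  > **Example**
+  >
+  > The store paths below are illustrative placeholders:
+  >
+  > ```json
+  > "inputSrcs": [
+  >   "/nix/store/<hash>-builder.sh",
+  >   "/nix/store/<hash>-source.patch"
+  > ]
+  > ```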
+
+* `inputDrvs`:
+  A JSON object specifying the derivations on which this derivation depends, and which outputs of those derivations are used.
+
+ > **Example**
+ >
+ > ```json
+ > "inputDrvs": {
+ > "/nix/store/6lkh5yi7nlb7l6dr8fljlli5zfd9hq58-curl-7.73.0.drv": ["dev"],
+ > "/nix/store/fn3kgnfzl5dzym26j8g907gq3kbm8bfh-unzip-6.0.drv": ["out"]
+ > }
+ > ```
+
+  The example above specifies that this derivation depends on the `dev` output of `curl` and the `out` output of `unzip`.
+
+* `system`:
+ The system type on which this derivation is to be built
+ (e.g. `x86_64-linux`).
+
+* `builder`:
+ The absolute path of the program to be executed to run the build.
+ Typically this is the `bash` shell
+ (e.g. `/nix/store/r3j288vpmczbl500w6zz89gyfa4nr0b1-bash-4.4-p23/bin/bash`).
+
+* `args`:
+ The command-line arguments passed to the `builder`.
+
+* `env`:
+ The environment passed to the `builder`.
diff --git a/doc/manual/src/protocols/json/store-object-info.md b/doc/manual/src/protocols/json/store-object-info.md
new file mode 100644
index 000000000..ba4ab098f
--- /dev/null
+++ b/doc/manual/src/protocols/json/store-object-info.md
@@ -0,0 +1,98 @@
+# Store object info JSON format
+
+> **Warning**
+>
+> This JSON format is currently
+> [**experimental**](@docroot@/contributing/experimental-features.md#xp-feature-nix-command)
+> and subject to change.
+
+Info about a [store object].
+
+* `path`:
+
+ [Store path][store path] to the given store object.
+
+* `narHash`:
+
+ Hash of the [file system object] part of the store object when serialized as a [Nix Archive].
+
+* `narSize`:
+
+ Size of the [file system object] part of the store object when serialized as a [Nix Archive].
+
+* `references`:
+
+ An array of [store paths][store path], possibly including this one.
+
+* `ca` (optional):
+
+ Content address of this store object's file system object, used to compute its store path.
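+
+> **Example**
+>
+> A minimal sketch of these fields for a hypothetical store object; the paths, hashes, and sizes are placeholders:
+>
+> ```json
+> {
+>   "path": "/nix/store/<hash>-hello-2.12",
+>   "narHash": "sha256-<base64 digest>",
+>   "narSize": 123456,
+>   "references": [
+>     "/nix/store/<hash>-glibc-2.38",
+>     "/nix/store/<hash>-hello-2.12"
+>   ]
+> }
+> ```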
+
+[store object]: @docroot@/glossary.md#gloss-store-object
+[store path]: @docroot@/glossary.md#gloss-store-path
+[file system object]: @docroot@/store/file-system-object.md
+[Nix Archive]: @docroot@/glossary.md#gloss-nar
+
+## Impure fields
+
+These are not intrinsic properties of the store object.
+In other words, the same store object residing in different stores could have different values for these properties.
+
+* `deriver` (optional):
+
+ The path to the [derivation] from which this store object is produced.
+
+ [derivation]: @docroot@/glossary.md#gloss-store-derivation
+
+* `registrationTime` (optional):
+
+  When this store object was added to the store.
+
+* `ultimate` (optional):
+
+  Whether this store object is trusted because we built it ourselves, rather than substituted it from elsewhere.
+
+* `signatures` (optional):
+
+ Signatures claiming that this store object is what it claims to be.
+ Not relevant for [content-addressed] store objects,
+ but useful for [input-addressed] store objects.
+
+ [content-addressed]: @docroot@/glossary.md#gloss-content-addressed-store-object
+ [input-addressed]: @docroot@/glossary.md#gloss-input-addressed-store-object
+
+### `.narinfo` extra fields
+
+This metadata is specific to the "binary cache" family of Nix store types.
+This information is not intrinsic to the store object, but describes how it is stored.
+
+* `url`:
+
+ Where to download a compressed archive of the file system objects of this store object.
+
+* `compression`:
+
+ The compression format that the archive is in.
+
+* `fileHash`:
+
+ A digest for the compressed archive itself, as opposed to the data contained within.
+
+* `fileSize`:
+
+ The size of the compressed archive itself.
+
+## Computed closure fields
+
+These fields are not stored at all, but computed by traversing the other fields across all the store objects in a [closure].
+
+* `closureSize`:
+
+  The total size of this store object and every other object in its [closure].
+
+### `.narinfo` extra fields
+
+* `closureSize`:
+
+  The total size of the compressed archive for this store object and the compressed archives of every other object in its [closure].
+
+[closure]: @docroot@/glossary.md#gloss-closure
diff --git a/doc/manual/src/protocols/store-path.md b/doc/manual/src/protocols/store-path.md
new file mode 100644
index 000000000..565c4fa75
--- /dev/null
+++ b/doc/manual/src/protocols/store-path.md
@@ -0,0 +1,131 @@
+# Complete Store Path Calculation
+
+This is the complete specification for how store paths are calculated.
+
+The format of this specification is close to [Extended Backus–Naur form](https://en.wikipedia.org/wiki/Extended_Backus%E2%80%93Naur_form), but deviates in a few places, such as hash functions, which we treat as bidirectional for specification purposes.
+
+Regular users do *not* need to know this information --- store paths can be treated as black boxes computed from the properties of the store objects they refer to.
+But for those interested in exactly how Nix works, e.g. if they are reimplementing it, this information can be useful.
+
+## Store path proper
+
+```ebnf
+store-path = store-dir "/" digest "-" name
+```
+where
+
+- `name` = the name of the store object.
+
+- `store-dir` = the [store directory](@docroot@/store/store-path.md#store-directory)
+
+- `digest` = base-32 representation of the first 160 bits of a [SHA-256] hash of `fingerprint`
+
+  This is the hash part of the store path.
+
+## Fingerprint
+
+- ```ebnf
+ fingerprint = type ":" sha256 ":" inner-digest ":" store ":" name
+ ```
+
+ Note that it includes the location of the store as well as the name to make sure that changes to either of those are reflected in the hash
+ (e.g. you won't get `/nix/store/-name1` and `/nix/store/-name2`, or `/gnu/store/-name1`, with equal hash parts).
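+
+  For illustration, the fingerprint of a store object named `hello-2.12` copied into `/nix/store` would have the following shape, with `<inner-digest>` standing in for the actual base-16 digest:
+
+  ```
+  source:sha256:<inner-digest>:/nix/store:hello-2.12
+  ```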
+
+- `type` = one of:
+
+ - ```ebnf
+ | "text" ( ":" store-path )*
+ ```
+
+    For encoded derivations written to the store.
+ The optional trailing store paths are the references of the store object.
+
+ - ```ebnf
+ | "source" ( ":" store-path )*
+ ```
+
+ For paths copied to the store and hashed via a [Nix Archive (NAR)] and [SHA-256][sha-256].
+ Just like in the text case, we can have the store objects referenced by their paths.
+ Additionally, we can have an optional `:self` label to denote self reference.
+
+ - ```ebnf
+ | "output:" id
+ ```
+
+    For either the outputs built from derivations, or paths copied to the store that are hashed as a single flat file or with a hash algorithm other than [SHA-256][sha-256].
+    (Paths hashed as a NAR with SHA-256 use the `"source"` case above instead; this distinction exists only for compatibility.)
+
+    `id` is the name of the output (usually `"out"`).
+    For content-addressed store objects, `id` is always `"out"`.
+
+- `inner-digest` = base-16 representation of a SHA-256 hash of `inner-fingerprint`
+
+## Inner fingerprint
+
+- `inner-fingerprint` = one of the following based on `type`:
+
+ - if `type` = `"text:" ...`:
+
+ the string written to the resulting store path.
+
+ - if `type` = `"source:" ...`:
+
+    the hash of the [Nix Archive (NAR)] serialization of the [file system object](@docroot@/store/file-system-object.md) of the store object.
+
+ - if `type` = `"output:" id`:
+
+ - For input-addressed derivation outputs:
+
+ the [ATerm](@docroot@/protocols/derivation-aterm.md) serialization of the derivation modulo fixed output derivations.
+
+ - For content-addressed store paths:
+
+ ```ebnf
+ "fixed:out:" rec algo ":" hash ":"
+ ```
+
+ where
+
+ - `rec` = one of:
+
+ - ```ebnf
+ | ""
+ ```
+ (empty string) for hashes of the flat (single file) serialization
+
+ - ```ebnf
+ | "r:"
+ ```
+        hashes of the [Nix Archive (NAR)] (arbitrary file system object) serialization
+
+ - ```ebnf
+ | "git:"
+ ```
+        hashes of the [Git blob/tree](https://git-scm.com/book/en/v2/Git-Internals-Git-Objects) [Merkle tree](https://en.wikipedia.org/wiki/Merkle_tree) format
+
+ - ```ebnf
+ algo = "md5" | "sha1" | "sha256"
+ ```
+
+    - `hash` = base-16 representation of the recursive (NAR) or flat hash of the contents of the path (or of the expected contents of the path for fixed-output derivations).
+
+ Note that `id` = `"out"`, regardless of the name part of the store path.
+ Also note that NAR + SHA-256 must not use this case, and instead must use the `type` = `"source:" ...` case.
+
+[Nix Archive (NAR)]: @docroot@/glossary.md#gloss-NAR
+[sha-256]: https://en.m.wikipedia.org/wiki/SHA-256
+
+### Historical Note
+
+The `type` = `"source:" ...` and `type` = `"output:out"` grammars technically overlap in purpose,
+in that both can represent data hashed by its SHA-256 NAR serialization.
+
+The original reason for this way of computing names was to prevent name collisions (for security).
+For instance, the thinking was that it shouldn't be feasible to come up with a derivation whose output path collides with the path for a copied source.
+The former would have an `inner-fingerprint` starting with `output:out:`, while the latter would have an `inner-fingerprint` starting with `source:`.
+
+Since `64519cfd657d024ae6e2bb74cb21ad21b886fd2a` (2008), however, it was decided that separating derivation-produced vs manually-hashed content-addressed data like this was not useful.
+Now, data that is content-addressed with SHA-256 + NAR-serialization always uses the `source:...` construction, regardless of how it was produced (manually or by derivation).
+This allows freely switching between using [fixed-output derivations](@docroot@/glossary.md#gloss-fixed-output-derivation) for fetching, and fetching out-of-band and then manually adding.
+It also removes the ambiguity from the grammar.
diff --git a/doc/manual/src/quick-start.md b/doc/manual/src/quick-start.md
index 5f54abbde..9eb7a3265 100644
--- a/doc/manual/src/quick-start.md
+++ b/doc/manual/src/quick-start.md
@@ -1,99 +1,43 @@
# Quick Start
-This chapter is for impatient people who don't like reading
-documentation. For more in-depth information you are kindly referred
-to subsequent chapters.
+This chapter is for impatient people who don't like reading documentation.
+For more in-depth information you are kindly referred to subsequent chapters.
-1. Install Nix by running the following:
+1. Install Nix:
```console
$ curl -L https://nixos.org/nix/install | sh
```
The install script will use `sudo`, so make sure you have sufficient rights.
- On Linux, `--daemon` can be omitted for a single-user install.
- For other installation methods, see [here](installation/index.md).
+ For other installation methods, see the detailed [installation instructions](installation/index.md).
-1. See what installable packages are currently available in the
- channel:
+1. Run software without installing it permanently:
```console
- $ nix-env --query --available --attr-path
- nixpkgs.docbook_xml_dtd_43 docbook-xml-4.3
- nixpkgs.docbook_xml_dtd_45 docbook-xml-4.5
- nixpkgs.firefox firefox-33.0.2
- nixpkgs.hello hello-2.9
- nixpkgs.libxslt libxslt-1.1.28
- …
+ $ nix-shell --packages cowsay lolcat
```
-1. Install some packages from the channel:
+ This downloads the specified packages with all their dependencies, and drops you into a Bash shell where the commands provided by those packages are present.
+ This will not affect your normal environment:
```console
- $ nix-env --install --attr nixpkgs.hello
+ [nix-shell:~]$ cowsay Hello, Nix! | lolcat
```
- This should download pre-built packages; it should not build them
- locally (if it does, something went wrong).
-
-1. Test that they work:
+ Exiting the shell will make the programs disappear again:
```console
- $ which hello
- /home/eelco/.nix-profile/bin/hello
- $ hello
- Hello, world!
- ```
-
-1. Uninstall a package:
-
- ```console
- $ nix-env --uninstall hello
- ```
-
-1. You can also test a package without installing it:
-
- ```console
- $ nix-shell --packages hello
- ```
-
- This builds or downloads GNU Hello and its dependencies, then drops
- you into a Bash shell where the `hello` command is present, all
- without affecting your normal environment:
-
- ```console
- [nix-shell:~]$ hello
- Hello, world!
-
[nix-shell:~]$ exit
-
- $ hello
- hello: command not found
+ $ lolcat
+ lolcat: command not found
```
-1. To keep up-to-date with the channel, do:
+1. Search for more packages on [search.nixos.org](https://search.nixos.org/) to try them out.
+
+1. Free up storage space:
```console
- $ nix-channel --update nixpkgs
- $ nix-env --upgrade '*'
- ```
-
- The latter command will upgrade each installed package for which
- there is a “newer” version (as determined by comparing the version
- numbers).
-
-1. If you're unhappy with the result of a `nix-env` action (e.g., an
- upgraded package turned out not to work properly), you can go back:
-
- ```console
- $ nix-env --rollback
- ```
-
-1. You should periodically run the Nix garbage collector to get rid of
- unused packages, since uninstalls or upgrades don't actually delete
- them:
-
- ```console
- $ nix-collect-garbage --delete-old
+ $ nix-collect-garbage
```
diff --git a/doc/manual/src/release-notes/rl-2.20.md b/doc/manual/src/release-notes/rl-2.20.md
new file mode 100644
index 000000000..8ede168a4
--- /dev/null
+++ b/doc/manual/src/release-notes/rl-2.20.md
@@ -0,0 +1,202 @@
+# Release 2.20.0 (2024-01-29)
+
+- Option `allowed-uris` can now match whole schemes in URIs without slashes [#9547](https://github.com/NixOS/nix/pull/9547)
+
+  If a scheme, such as `github:`, is specified in the `allowed-uris` option, all URIs starting with `github:` are allowed.
+ Previously this only worked for schemes whose URIs used the `://` syntax.
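+
+  For example, an illustrative `nix.conf` entry that permits any `github:` URI as well as anything under the given HTTPS prefix:
+
+  ```
+  allowed-uris = github: https://github.com/NixOS/
+  ```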
+
+- Include cgroup stats when building through the daemon [#9598](https://github.com/NixOS/nix/pull/9598)
+
+ Nix now also reports cgroup statistics when building through the Nix daemon and when doing remote builds using `ssh-ng`,
+ if both sides of the connection are using Nix 2.20 or newer.
+
+- Disallow empty search regex in `nix search` [#9481](https://github.com/NixOS/nix/pull/9481)
+
+ [`nix search`](@docroot@/command-ref/new-cli/nix3-search.md) now requires a search regex to be passed. To show all packages, use `^`.
+
+- Add new `eval-system` setting [#4093](https://github.com/NixOS/nix/pull/4093)
+
+ Add a new `eval-system` option.
+ Unlike `system`, it just overrides the value of `builtins.currentSystem`.
+  This is more useful than overriding `system`, because these derivations can then be built on remote builders that support the given system.
+  In contrast, overriding `system` also affects scheduling, which would cause Nix to build those derivations locally even if that doesn't make sense.
+
+ `eval-system` only takes effect if it is non-empty.
+  If empty (the default), `system` is used as before, so there is no breakage.
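+
+  For example, an illustrative `nix.conf` entry that makes `builtins.currentSystem` report `aarch64-linux`:
+
+  ```
+  eval-system = aarch64-linux
+  ```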
+
+- Import-from-derivation builds the derivation in the build store [#9661](https://github.com/NixOS/nix/pull/9661)
+
+ When using `--eval-store`, `import`ing from a derivation will now result in the derivation being built on the build store, i.e. the store specified in the `store` Nix option.
+
+ Because the resulting Nix expression must be copied back to the evaluation store in order to be imported, this requires the evaluation store to trust the build store's signatures.
+
+- Mounted SSH Store [#7890](https://github.com/NixOS/nix/issues/7890) [#7912](https://github.com/NixOS/nix/pull/7912)
+
+ Introduced the store [`mounted-ssh-ng://`](@docroot@/command-ref/new-cli/nix3-help-stores.md).
+ This store allows full access to a Nix store on a remote machine and additionally requires that the store be mounted in the local filesystem.
+
+- Rename `nix show-config` to `nix config show` [#7672](https://github.com/NixOS/nix/issues/7672) [#9477](https://github.com/NixOS/nix/pull/9477)
+
+ `nix show-config` was renamed to `nix config show`, and `nix doctor` was renamed to `nix config check`, to be more consistent with the rest of the command line interface.
+
+- Add command `nix hash convert` [#9452](https://github.com/NixOS/nix/pull/9452)
+
+ This replaces the old `nix hash to-*` commands, which are still available but will emit a deprecation warning. Please convert as follows:
+
+ - `nix hash to-base16 $hash1 $hash2`: Use `nix hash convert --to base16 $hash1 $hash2` instead.
+ - `nix hash to-base32 $hash1 $hash2`: Use `nix hash convert --to nix32 $hash1 $hash2` instead.
+ - `nix hash to-base64 $hash1 $hash2`: Use `nix hash convert --to base64 $hash1 $hash2` instead.
+  - `nix hash to-sri $hash1 $hash2`: Use `nix hash convert --to sri $hash1 $hash2` or even just `nix hash convert $hash1 $hash2` instead.
+
+- Rename hash format `base32` to `nix32` [#9452](https://github.com/NixOS/nix/pull/9452)
+
+ Hash format `base32` was renamed to `nix32` since it used a special Nix-specific character set for
+ [Base32](https://en.wikipedia.org/wiki/Base32).
+
+- `nix profile` now allows referring to elements by human-readable names [#8678](https://github.com/NixOS/nix/pull/8678)
+
+ [`nix profile`](@docroot@/command-ref/new-cli/nix3-profile.md) now uses names to refer to installed packages when running [`list`](@docroot@/command-ref/new-cli/nix3-profile-list.md), [`remove`](@docroot@/command-ref/new-cli/nix3-profile-remove.md) or [`upgrade`](@docroot@/command-ref/new-cli/nix3-profile-upgrade.md) as opposed to indices. Profile element names are generated when a package is installed and remain the same until the package is removed.
+
+ **Warning**: The `manifest.nix` file used to record the contents of profiles has changed. Nix will automatically upgrade profiles to the new version when you modify the profile. After that, the profile can no longer be used by older versions of Nix.
+
+- Give `nix store add` a `--hash-algo` flag [#9809](https://github.com/NixOS/nix/pull/9809)
+
+ Adds a missing feature that was present in the old CLI, and matches our
+ plans to have similar flags for `nix hash convert` and `nix hash path`.
+
+- Coercion errors include the failing value
+
+  Errors like `error: cannot coerce a set to a string` now include the value
+  which caused the error.
+
+ Before:
+
+ ```
+ error: cannot coerce a set to a string
+ ```
+
+ After:
+
+ ```
+ error: cannot coerce a set to a string: { aesSupport = «thunk»;
+ avx2Support = «thunk»; avx512Support = «thunk»; avxSupport = «thunk»;
+ canExecute = «thunk»; config = «thunk»; darwinArch = «thunk»; darwinMinVersion
+ = «thunk»; darwinMinVersionVariable = «thunk»; darwinPlatform = «thunk»; «84
+ attributes elided»}
+ ```
+
+- Type errors include the failing value
+
+ In errors like `value is an integer while a list was expected`, the message now
+ includes the failing value.
+
+ Before:
+
+ ```
+ error: value is a set while a string was expected
+ ```
+
+ After:
+
+ ```
+ error: expected a string but found a set: { ghc810 = «thunk»;
+ ghc8102Binary = «thunk»; ghc8107 = «thunk»; ghc8107Binary = «thunk»;
+ ghc865Binary = «thunk»; ghc90 = «thunk»; ghc902 = «thunk»; ghc92 = «thunk»;
+ ghc924Binary = «thunk»; ghc925 = «thunk»; «17 attributes elided»}
+ ```
+
+- Source locations are printed more consistently in errors [#561](https://github.com/NixOS/nix/issues/561) [#9555](https://github.com/NixOS/nix/pull/9555)
+
+ Source location information is now included in error messages more
+ consistently. Given this code:
+
+ ```nix
+ let
+ attr = {foo = "bar";};
+ key = {};
+ in
+ attr.${key}
+ ```
+
+ Previously, Nix would show this unhelpful message when attempting to evaluate
+ it:
+
+ ```
+ error:
+ … while evaluating an attribute name
+
+ error: value is a set while a string was expected
+ ```
+
+ Now, the error message displays where the problematic value was found:
+
+ ```
+ error:
+ … while evaluating an attribute name
+
+ at bad.nix:4:11:
+
+ 3| key = {};
+ 4| in attr.${key}
+ | ^
+ 5|
+
+ error: expected a string but found a set
+ ```
+
+- Some stack overflow segfaults are fixed [#9616](https://github.com/NixOS/nix/issues/9616) [#9617](https://github.com/NixOS/nix/pull/9617)
+
+ The number of nested function calls has been restricted, to detect and report
+ infinite function call recursions. The default maximum call depth is 10,000 and
+ can be set with [the `max-call-depth`
+ option](@docroot@/command-ref/conf-file.md#conf-max-call-depth).
+
+ This replaces the `stack overflow (possible infinite recursion)` message.
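+
+  For example, an illustrative `nix.conf` entry that raises the limit:
+
+  ```
+  max-call-depth = 100000
+  ```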
+
+- Better error reporting for `with` expressions [#9658](https://github.com/NixOS/nix/pull/9658)
+
+ `with` expressions using non-attrset values to resolve variables are now reported with proper positions, e.g.
+
+ ```
+ nix-repl> with 1; a
+ error:
+ … while evaluating the first subexpression of a with expression
+ at «string»:1:1:
+ 1| with 1; a
+ | ^
+
+ error: expected a set but found an integer
+ ```
+
+- Functions are printed with more detail [#7145](https://github.com/NixOS/nix/issues/7145) [#9606](https://github.com/NixOS/nix/pull/9606)
+
+  `nix repl`, `nix eval`, `builtins.trace`, and most other places where values are
+  printed will now include function names and source location information:
+
+ ```
+ $ nix repl nixpkgs
+ nix-repl> builtins.map
+ «primop map»
+
+ nix-repl> builtins.map lib.id
+ «partially applied primop map»
+
+ nix-repl> builtins.trace lib.id "my-value"
+ trace: «lambda id @ /nix/store/8rrzq23h2zq7sv5l2vhw44kls5w0f654-source/lib/trivial.nix:26:5»
+ "my-value"
+ ```
+
+- Flake operations like `nix develop` will no longer fail when run in a Git
+ repository where the `flake.lock` file is `.gitignore`d
+ [#8854](https://github.com/NixOS/nix/issues/8854)
+ [#9324](https://github.com/NixOS/nix/pull/9324)
+
+- Nix commands will now respect Ctrl-C
+ [#7145](https://github.com/NixOS/nix/issues/7145)
+ [#6995](https://github.com/NixOS/nix/pull/6995)
+ [#9687](https://github.com/NixOS/nix/pull/9687)
+
+ Previously, many Nix commands would hang indefinitely if Ctrl-C was pressed
+ while performing various operations (including `nix develop`, `nix flake
+ update`, and so on). With several fixes to Nix's signal handlers, Nix
+ commands will now exit quickly after Ctrl-C is pressed.
diff --git a/doc/manual/src/release-notes/rl-2.21.md b/doc/manual/src/release-notes/rl-2.21.md
new file mode 100644
index 000000000..75114f117
--- /dev/null
+++ b/doc/manual/src/release-notes/rl-2.21.md
@@ -0,0 +1,302 @@
+# Release 2.21.0 (2024-03-11)
+
+- Fix a fixed-output derivation sandbox escape (CVE-2024-27297)
+
+ Cooperating Nix derivations could send file descriptors to files in the Nix
+ store to each other via Unix domain sockets in the abstract namespace. This
+ allowed one derivation to modify the output of the other derivation, after Nix
+ has registered the path as "valid" and immutable in the Nix database.
+ In particular, this allowed the output of fixed-output derivations to be
+ modified from their expected content.
+
+ This isn't the case any more.
+
+- CLI options `--arg-from-file` and `--arg-from-stdin` [#10122](https://github.com/NixOS/nix/pull/10122)
+
+ The new CLI option `--arg-from-file` *name* *path* passes the contents
+ of file *path* as a string value via the function argument *name* to a
+ Nix expression. Similarly, the new option `--arg-from-stdin` *name*
+  reads the string value from standard input.
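+
+  For example, a hypothetical invocation that passes a file's contents to the function argument `contents`:
+
+  ```console
+  $ nix eval --expr '{ contents }: builtins.stringLength contents' --arg-from-file contents ./README.md
+  ```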
+
+- Concise error printing in `nix repl` [#9928](https://github.com/NixOS/nix/pull/9928)
+
+ Previously, if an element of a list or attribute set threw an error while
+ evaluating, `nix repl` would print the entire error (including source location
+ information) inline. This output was clumsy and difficult to parse:
+
+ ```
+ nix-repl> { err = builtins.throw "uh oh!"; }
+ { err = «error:
+ … while calling the 'throw' builtin
+ at «string»:1:9:
+ 1| { err = builtins.throw "uh oh!"; }
+ | ^
+
+ error: uh oh!»; }
+ ```
+
+ Now, only the error message is displayed, making the output much more readable.
+ ```
+ nix-repl> { err = builtins.throw "uh oh!"; }
+ { err = «error: uh oh!»; }
+ ```
+
+ However, if the whole expression being evaluated throws an error, source
+ locations and (if applicable) a stack trace are printed, just like you'd expect:
+
+ ```
+ nix-repl> builtins.throw "uh oh!"
+ error:
+ … while calling the 'throw' builtin
+ at «string»:1:1:
+ 1| builtins.throw "uh oh!"
+ | ^
+
+ error: uh oh!
+ ```
+
+- `--debugger` can now access bindings from `let` expressions [#8827](https://github.com/NixOS/nix/issues/8827) [#9918](https://github.com/NixOS/nix/pull/9918)
+
+ Breakpoints and errors in the bindings of a `let` expression can now access
+ those bindings in the debugger. Previously, only the body of `let` expressions
+ could access those bindings.
+
+- Enter the `--debugger` when `builtins.trace` is called if `debugger-on-trace` is set [#9914](https://github.com/NixOS/nix/pull/9914)
+
+ If the `debugger-on-trace` option is set and `--debugger` is given,
+ `builtins.trace` calls will behave similarly to `builtins.break` and will enter
+ the debug REPL. This is useful for determining where warnings are being emitted
+ from.
+
+- Debugger prints source position information [#9913](https://github.com/NixOS/nix/pull/9913)
+
+  The `--debugger` now prints source location information, instead of raw
+  pointers to that information. Before:
+
+ ```
+ nix-repl> :bt
+ 0: while evaluating the attribute 'python311.pythonForBuild.pkgs'
+ 0x600001522598
+ ```
+
+ After:
+
+ ```
+ 0: while evaluating the attribute 'python311.pythonForBuild.pkgs'
+ /nix/store/hg65h51xnp74ikahns9hyf3py5mlbbqq-source/overrides/default.nix:132:27
+
+ 131|
+ 132| bootstrappingBase = pkgs.${self.python.pythonAttr}.pythonForBuild.pkgs;
+ | ^
+ 133| in
+ ```
+
+- The `--debugger` will start more reliably in `let` expressions and function calls [#6649](https://github.com/NixOS/nix/issues/6649) [#9917](https://github.com/NixOS/nix/pull/9917)
+
+ Previously, if you attempted to evaluate this file with the debugger:
+
+ ```nix
+ let
+ a = builtins.trace "before inner break" (
+ builtins.break "hello"
+ );
+ b = builtins.trace "before outer break" (
+ builtins.break a
+ );
+ in
+ b
+ ```
+
+ Nix would correctly enter the debugger at `builtins.break a`, but if you asked
+ it to `:continue`, it would skip over the `builtins.break "hello"` expression
+ entirely.
+
+ Now, Nix will correctly enter the debugger at both breakpoints.
+
+- Nested debuggers are no longer supported [#9920](https://github.com/NixOS/nix/pull/9920)
+
+ Previously, evaluating an expression that throws an error in the debugger would
+ enter a second, nested debugger:
+
+ ```
+ nix-repl> builtins.throw "what"
+ error: what
+
+
+ Starting REPL to allow you to inspect the current state of the evaluator.
+
+ Welcome to Nix 2.18.1. Type :? for help.
+
+ nix-repl>
+ ```
+
+ Now, it just prints the error message like `nix repl`:
+
+ ```
+ nix-repl> builtins.throw "what"
+ error:
+ … while calling the 'throw' builtin
+ at «string»:1:1:
+ 1| builtins.throw "what"
+ | ^
+
+ error: what
+ ```
+
+- Consistent order of function arguments in printed expressions [#9874](https://github.com/NixOS/nix/pull/9874)
+
+ Function arguments are now printed in lexicographic order rather than the internal, creation-time based symbol order.
+
+- Fix duplicate attribute error positions for `inherit` [#9874](https://github.com/NixOS/nix/pull/9874)
+
+  When an `inherit` caused a duplicate attribute error, the position of the error was not reported correctly, placing the error at the `inherit` itself or at the start of the bindings block instead of at the offending attribute name.
+
+- `inherit (x) ...` evaluates `x` only once [#9847](https://github.com/NixOS/nix/pull/9847)
+
+ `inherit (x) a b ...` now evaluates the expression `x` only once for all inherited attributes rather than once for each inherited attribute.
+ This does not usually have a measurable impact, but side-effects (such as `builtins.trace`) would be duplicated and expensive expressions (such as derivations) could cause a measurable slowdown.
+
+- Store paths are allowed to start with `.` [#912](https://github.com/NixOS/nix/issues/912) [#9091](https://github.com/NixOS/nix/pull/9091) [#9095](https://github.com/NixOS/nix/pull/9095) [#9120](https://github.com/NixOS/nix/pull/9120) [#9121](https://github.com/NixOS/nix/pull/9121) [#9122](https://github.com/NixOS/nix/pull/9122) [#9130](https://github.com/NixOS/nix/pull/9130) [#9219](https://github.com/NixOS/nix/pull/9219) [#9224](https://github.com/NixOS/nix/pull/9224) [#9867](https://github.com/NixOS/nix/pull/9867)
+
+ Leading periods were allowed by accident in Nix 2.4. The Nix team has considered this to be a bug, but this behavior has since been relied on by users, leading to unnecessary difficulties.
+ From now on, leading periods are supported. The names `.` and `..` are disallowed, as well as those starting with `.-` or `..-`.
+
+ Nix versions that denied leading periods are documented [in the issue](https://github.com/NixOS/nix/issues/912#issuecomment-1919583286).
+
+- `nix repl` pretty-prints values [#9931](https://github.com/NixOS/nix/pull/9931)
+
+ `nix repl` will now pretty-print values:
+
+ ```
+ {
+ attrs = {
+ a = {
+ b = {
+ c = { };
+ };
+ };
+ };
+ list = [ 1 ];
+ list' = [
+ 1
+ 2
+ 3
+ ];
+ }
+ ```
+
+- Introduction of `--regex` and `--all` in `nix profile remove` and `nix profile upgrade` [#10166](https://github.com/NixOS/nix/pull/10166)
+
+  Previously the command-line arguments for `nix profile remove` and `nix profile upgrade` matched the package entries using regular expressions.
+ For instance:
+
+ ```
+ nix profile remove '.*vim.*'
+ ```
+
+ This would remove all packages that contain `vim` in their name.
+
+  In most cases, only singular package names were used to remove and upgrade packages. Mixing this with regular expressions sometimes led to unintended behavior. For instance, `python3.1` could match `python311`.
+
+  To avoid unintended behavior, the arguments now match exact names only.
+
+ Matching using regular expressions is still possible by using the new `--regex` flag:
+
+ ```
+ nix profile remove --regex '.*vim.*'
+ ```
+
+ One of the most useful cases for using regular expressions was to upgrade all packages. This was previously accomplished by:
+
+ ```
+ nix profile upgrade '.*'
+ ```
+
+ With the introduction of the `--all` flag, this now becomes more straightforward:
+
+ ```
+ nix profile upgrade --all
+ ```
+
+- Visual clutter in `--debugger` is reduced [#9919](https://github.com/NixOS/nix/pull/9919)
+
+ Before:
+ ```
+ info: breakpoint reached
+
+
+ Starting REPL to allow you to inspect the current state of the evaluator.
+
+ Welcome to Nix 2.20.0pre20231222_dirty. Type :? for help.
+
+ nix-repl> :continue
+ error: uh oh
+
+
+ Starting REPL to allow you to inspect the current state of the evaluator.
+
+ Welcome to Nix 2.20.0pre20231222_dirty. Type :? for help.
+
+ nix-repl>
+ ```
+
+ After:
+
+ ```
+ info: breakpoint reached
+
+ Nix 2.20.0pre20231222_dirty debugger
+ Type :? for help.
+ nix-repl> :continue
+ error: uh oh
+
+ nix-repl>
+ ```
+
+- Cycle detection in `nix repl` is simpler and more reliable [#8672](https://github.com/NixOS/nix/issues/8672) [#9926](https://github.com/NixOS/nix/pull/9926)
+
+ The cycle detection in `nix repl`, `nix eval`, `builtins.trace`, and everywhere
+ else values are printed is now simpler and matches the cycle detection in
+ `nix-instantiate --eval` output.
+
+ Before:
+
+ ```
+ nix eval --expr 'let self = { inherit self; }; in self'
+ { self = { self = «repeated»; }; }
+ ```
+
+ After:
+
+ ```
+ { self = «repeated»; }
+ ```
+
+- In the debugger, `while evaluating the attribute` errors now include position information [#9915](https://github.com/NixOS/nix/pull/9915)
+
+ Before:
+
+ ```
+ 0: while evaluating the attribute 'python311.pythonForBuild.pkgs'
+ 0x600001522598
+ ```
+
+ After:
+
+ ```
+ 0: while evaluating the attribute 'python311.pythonForBuild.pkgs'
+ /nix/store/hg65h51xnp74ikahns9hyf3py5mlbbqq-source/overrides/default.nix:132:27
+
+ 131|
+ 132| bootstrappingBase = pkgs.${self.python.pythonAttr}.pythonForBuild.pkgs;
+ | ^
+ 133| in
+ ```
+
+- Stack size is increased on macOS [#9860](https://github.com/NixOS/nix/pull/9860)
+
+  Previously, Nix would set the stack size to 64 MiB on Linux, but would leave the
+  stack size set to the default (approximately 8 MiB) on macOS. Now, the stack
+  size is correctly set to 64 MiB on macOS as well, which should reduce stack
+  overflow segfaults in deeply-recursive Nix expressions.
+
diff --git a/flake.lock b/flake.lock
index f120d3b5f..bb2e400c0 100644
--- a/flake.lock
+++ b/flake.lock
@@ -32,34 +32,18 @@
"type": "github"
}
},
- "lowdown-src": {
- "flake": false,
- "locked": {
- "lastModified": 1633514407,
- "narHash": "sha256-Dw32tiMjdK9t3ETl5fzGrutQTzh2rufgZV4A/BbxuD4=",
- "owner": "kristapsdz",
- "repo": "lowdown",
- "rev": "d2c2b44ff6c27b936ec27358a2653caaef8f73b8",
- "type": "github"
- },
- "original": {
- "owner": "kristapsdz",
- "repo": "lowdown",
- "type": "github"
- }
- },
"nixpkgs": {
"locked": {
- "lastModified": 1700748986,
- "narHash": "sha256-/nqLrNU297h3PCw4QyDpZKZEUHmialJdZW2ceYFobds=",
+ "lastModified": 1709083642,
+ "narHash": "sha256-7kkJQd4rZ+vFrzWu8sTRtta5D1kBG0LSRYAfhtmMlSo=",
"owner": "NixOS",
"repo": "nixpkgs",
- "rev": "9ba29e2346bc542e9909d1021e8fd7d4b3f64db0",
+ "rev": "b550fe4b4776908ac2a861124307045f8e717c8e",
"type": "github"
},
"original": {
"owner": "NixOS",
- "ref": "nixos-23.05-small",
+ "ref": "release-23.11",
"repo": "nixpkgs",
"type": "github"
}
@@ -84,7 +68,6 @@
"inputs": {
"flake-compat": "flake-compat",
"libgit2": "libgit2",
- "lowdown-src": "lowdown-src",
"nixpkgs": "nixpkgs",
"nixpkgs-regression": "nixpkgs-regression"
}
diff --git a/flake.nix b/flake.nix
index 489fc625f..660527636 100644
--- a/flake.nix
+++ b/flake.nix
@@ -6,22 +6,21 @@
description = "The purely functional package manager - but super!";
- inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-23.05-small";
+ # TODO switch to nixos-23.11-small
+ # https://nixpk.gs/pr-tracker.html?pr=291954
+ inputs.nixpkgs.url = "github:NixOS/nixpkgs/release-23.11";
inputs.nixpkgs-regression.url = "github:NixOS/nixpkgs/215d4d0fd80ca5163643b03a33fde804a29cc1e2";
- inputs.lowdown-src = { url = "github:kristapsdz/lowdown"; flake = false; };
inputs.flake-compat = { url = "github:edolstra/flake-compat"; flake = false; };
inputs.libgit2 = { url = "github:libgit2/libgit2"; flake = false; };
- outputs = { self, nixpkgs, nixpkgs-regression, lowdown-src, flake-compat, libgit2 }:
+ outputs = { self, nixpkgs, nixpkgs-regression, libgit2, ... }:
let
inherit (nixpkgs) lib;
+ inherit (lib) fileset;
officialRelease = false;
- # Set to true to build the release notes for the next release.
- buildUnreleasedNotes = false;
-
version = lib.fileContents ./.version + versionSuffix;
versionSuffix =
if officialRelease
@@ -35,11 +34,25 @@
systems = linuxSystems ++ darwinSystems;
crossSystems = [
- "armv6l-linux" "armv7l-linux"
- "x86_64-freebsd13" "x86_64-netbsd"
+ "armv6l-unknown-linux-gnueabihf"
+ "armv7l-unknown-linux-gnueabihf"
+ "x86_64-unknown-netbsd"
];
- stdenvs = [ "gccStdenv" "clangStdenv" "clang11Stdenv" "stdenv" "libcxxStdenv" "ccacheStdenv" ];
+      # Nix doesn't yet build on this platform, so we put it in a
+      # separate list. We just use this for `devShells` and
+      # `nixpkgsFor`, which depend on it.
+ shellCrossSystems = crossSystems ++ [
+ "x86_64-w64-mingw32"
+ ];
+
+ stdenvs = [
+ "ccacheStdenv"
+ "clangStdenv"
+ "gccStdenv"
+ "libcxxStdenv"
+ "stdenv"
+ ];
forAllSystems = lib.genAttrs systems;
@@ -54,57 +67,6 @@
})
stdenvs);
- # Experimental fileset library: https://github.com/NixOS/nixpkgs/pull/222981
- # Not an "idiomatic" flake input because:
- # - Propagation to dependent locks: https://github.com/NixOS/nix/issues/7730
- # - Subflake would download redundant and huge parent flake
- # - No git tree hash support: https://github.com/NixOS/nix/issues/6044
- inherit (import (builtins.fetchTarball { url = "https://github.com/NixOS/nix/archive/1bdcd7fc8a6a40b2e805bad759b36e64e911036b.tar.gz"; sha256 = "sha256:14ljlpdsp4x7h1fkhbmc4bd3vsqnx8zdql4h3037wh09ad6a0893"; }))
- fileset;
-
- baseFiles =
- # .gitignore has already been processed, so any changes in it are irrelevant
- # at this point. It is not represented verbatim for test purposes because
- # that would interfere with repo semantics.
- fileset.fileFilter (f: f.name != ".gitignore") ./.;
-
- configureFiles = fileset.unions [
- ./.version
- ./configure.ac
- ./m4
- # TODO: do we really need README.md? It doesn't seem used in the build.
- ./README.md
- ];
-
- topLevelBuildFiles = fileset.unions [
- ./local.mk
- ./Makefile
- ./Makefile.config.in
- ./mk
- ];
-
- functionalTestFiles = fileset.unions [
- ./tests/functional
- (fileset.fileFilter (f: lib.strings.hasPrefix "nix-profile" f.name) ./scripts)
- ];
-
- nixSrc = fileset.toSource {
- root = ./.;
- fileset = fileset.intersect baseFiles (fileset.unions [
- configureFiles
- topLevelBuildFiles
- ./boehmgc-coroutine-sp-fallback.diff
- ./doc
- ./misc
- ./precompiled-headers.h
- ./src
- ./tests/unit
- ./COPYING
- ./scripts/local.mk
- functionalTestFiles
- ]);
- };
-
# Memoize nixpkgs for different platforms for efficiency.
nixpkgsFor = forAllSystems
(system: let
@@ -113,8 +75,8 @@
inherit system;
};
crossSystem = if crossSystem == null then null else {
- system = crossSystem;
- } // lib.optionalAttrs (crossSystem == "x86_64-freebsd13") {
+ config = crossSystem;
+ } // lib.optionalAttrs (crossSystem == "x86_64-unknown-freebsd13") {
useLLVM = true;
};
overlays = [
@@ -126,407 +88,114 @@
in {
inherit stdenvs native;
static = native.pkgsStatic;
- cross = forAllCrossSystems (crossSystem: make-pkgs crossSystem "stdenv");
+ cross = lib.genAttrs shellCrossSystems (crossSystem: make-pkgs crossSystem "stdenv");
});
- commonDeps =
- { pkgs
- , isStatic ? pkgs.stdenv.hostPlatform.isStatic
- }:
- with pkgs; rec {
- # Use "busybox-sandbox-shell" if present,
- # if not (legacy) fallback and hope it's sufficient.
- sh = pkgs.busybox-sandbox-shell or (busybox.override {
- useMusl = true;
- enableStatic = true;
- enableMinimal = true;
- extraConfig = ''
- CONFIG_FEATURE_FANCY_ECHO y
- CONFIG_FEATURE_SH_MATH y
- CONFIG_FEATURE_SH_MATH_64 y
-
- CONFIG_ASH y
- CONFIG_ASH_OPTIMIZE_FOR_SIZE y
-
- CONFIG_ASH_ALIAS y
- CONFIG_ASH_BASH_COMPAT y
- CONFIG_ASH_CMDCMD y
- CONFIG_ASH_ECHO y
- CONFIG_ASH_GETOPTS y
- CONFIG_ASH_INTERNAL_GLOB y
- CONFIG_ASH_JOB_CONTROL y
- CONFIG_ASH_PRINTF y
- CONFIG_ASH_TEST y
- '';
- });
-
- configureFlags =
- lib.optionals stdenv.isLinux [
- "--with-boost=${boost}/lib"
- "--with-sandbox-shell=${sh}/bin/busybox"
- ]
- ++ lib.optionals (stdenv.isLinux && !(isStatic && stdenv.system == "aarch64-linux")) [
- "LDFLAGS=-fuse-ld=gold"
- ];
-
- testConfigureFlags = [
- "RAPIDCHECK_HEADERS=${lib.getDev rapidcheck}/extras/gtest/include"
- ] ++ lib.optionals (stdenv.hostPlatform != stdenv.buildPlatform) [
- "--enable-install-unit-tests"
- "--with-check-bin-dir=${builtins.placeholder "check"}/bin"
- "--with-check-lib-dir=${builtins.placeholder "check"}/lib"
- ];
-
- internalApiDocsConfigureFlags = [
- "--enable-internal-api-docs"
- ];
-
- changelog-d = pkgs.buildPackages.changelog-d;
-
- nativeBuildDeps =
- [
- buildPackages.bison
- buildPackages.flex
- (lib.getBin buildPackages.lowdown-nix)
- buildPackages.mdbook
- buildPackages.mdbook-linkcheck
- buildPackages.autoconf-archive
- buildPackages.autoreconfHook
- buildPackages.pkg-config
-
- # Tests
- buildPackages.git
- buildPackages.mercurial # FIXME: remove? only needed for tests
- buildPackages.jq # Also for custom mdBook preprocessor.
- buildPackages.openssh # only needed for tests (ssh-keygen)
- ]
- ++ lib.optionals stdenv.hostPlatform.isLinux [(buildPackages.util-linuxMinimal or buildPackages.utillinuxMinimal)]
- # Official releases don't have rl-next, so we don't need to compile a changelog
- ++ lib.optional (!officialRelease && buildUnreleasedNotes) changelog-d
- ;
-
- buildDeps =
- [ curl
- bzip2 xz brotli editline
- openssl sqlite
- libarchive
- (pkgs.libgit2.overrideAttrs (attrs: {
- src = libgit2;
- version = libgit2.lastModifiedDate;
- cmakeFlags = (attrs.cmakeFlags or []) ++ ["-DUSE_SSH=exec"];
- }))
- boost
- lowdown-nix
- libsodium
- ]
- ++ lib.optionals stdenv.isLinux [libseccomp]
- ++ lib.optional stdenv.hostPlatform.isx86_64 libcpuid;
-
- checkDeps = [
- gtest
- rapidcheck
- ];
-
- internalApiDocsDeps = [
- buildPackages.doxygen
- ];
-
- awsDeps = lib.optional (stdenv.isLinux || stdenv.isDarwin)
- (aws-sdk-cpp.override {
- apis = ["s3" "transfer"];
- customMemoryManagement = false;
- });
-
- propagatedDeps =
- [ ((boehmgc.override {
- enableLargeConfig = true;
- }).overrideAttrs(o: {
- patches = (o.patches or []) ++ [
- ./boehmgc-coroutine-sp-fallback.diff
-
- # https://github.com/ivmai/bdwgc/pull/586
- ./boehmgc-traceable_allocator-public.diff
- ];
- })
- )
- nlohmann_json
- ];
- };
-
- installScriptFor = systems:
- with nixpkgsFor.x86_64-linux.native;
- runCommand "installer-script"
- { buildInputs = [ nix ];
- }
- ''
- mkdir -p $out/nix-support
-
- # Converts /nix/store/50p3qk8k...-nix-2.4pre20201102_550e11f/bin/nix to 50p3qk8k.../bin/nix.
- tarballPath() {
- # Remove the store prefix
- local path=''${1#${builtins.storeDir}/}
- # Get the path relative to the derivation root
- local rest=''${path#*/}
- # Get the derivation hash
- local drvHash=''${path%%-*}
- echo "$drvHash/$rest"
- }
-
- substitute ${./scripts/install.in} $out/install \
- ${pkgs.lib.concatMapStrings
- (system: let
- tarball = if builtins.elem system crossSystems then self.hydraJobs.binaryTarballCross.x86_64-linux.${system} else self.hydraJobs.binaryTarball.${system};
- in '' \
- --replace '@tarballHash_${system}@' $(nix --experimental-features nix-command hash-file --base16 --type sha256 ${tarball}/*.tar.xz) \
- --replace '@tarballPath_${system}@' $(tarballPath ${tarball}/*.tar.xz) \
- ''
- )
- systems
- } --replace '@nixVersion@' ${version}
-
- echo "file installer $out/install" >> $out/nix-support/hydra-build-products
- '';
-
- testNixVersions = pkgs: client: daemon: with commonDeps { inherit pkgs; }; with pkgs.lib; pkgs.stdenv.mkDerivation {
- NIX_DAEMON_PACKAGE = daemon;
- NIX_CLIENT_PACKAGE = client;
- name =
- "nix-super-tests"
- + optionalString
- (versionAtLeast daemon.version "2.4pre20211005" &&
- versionAtLeast client.version "2.4pre20211005")
- "-${client.version}-against-${daemon.version}";
- inherit version;
-
- src = fileset.toSource {
- root = ./.;
- fileset = fileset.intersect baseFiles (fileset.unions [
- configureFiles
- topLevelBuildFiles
- functionalTestFiles
- ]);
+ installScriptFor = tarballs:
+ nixpkgsFor.x86_64-linux.native.callPackage ./scripts/installer.nix {
+ inherit tarballs;
};
- VERSION_SUFFIX = versionSuffix;
+ testNixVersions = pkgs: client: daemon:
+ pkgs.callPackage ./package.nix {
+ pname =
+ "nix-tests"
+ + lib.optionalString
+ (lib.versionAtLeast daemon.version "2.4pre20211005" &&
+ lib.versionAtLeast client.version "2.4pre20211005")
+ "-${client.version}-against-${daemon.version}";
- nativeBuildInputs = nativeBuildDeps;
- buildInputs = buildDeps ++ awsDeps ++ checkDeps;
- propagatedBuildInputs = propagatedDeps;
+ inherit fileset;
- enableParallelBuilding = true;
+ test-client = client;
+ test-daemon = daemon;
- configureFlags =
- testConfigureFlags # otherwise configure fails
- ++ [ "--disable-build" ];
- dontBuild = true;
- doInstallCheck = true;
+ doBuild = false;
+ };
- installPhase = ''
- mkdir -p $out
- '';
-
- installCheckPhase = ''
- mkdir -p src/nix-channel
- make installcheck -j$NIX_BUILD_CORES -l$NIX_BUILD_CORES
- '';
+ binaryTarball = nix: pkgs: pkgs.callPackage ./scripts/binary-tarball.nix {
+ inherit nix;
};
- binaryTarball = nix: pkgs:
- let
- inherit (pkgs) buildPackages;
- inherit (pkgs) cacert;
- installerClosureInfo = buildPackages.closureInfo { rootPaths = [ nix cacert ]; };
- in
-
- buildPackages.runCommand "nix-super-binary-tarball-${version}"
- { #nativeBuildInputs = lib.optional (system != "aarch64-linux") shellcheck;
- meta.description = "Distribution-independent Nix bootstrap binaries for ${pkgs.system}";
- }
- ''
- cp ${installerClosureInfo}/registration $TMPDIR/reginfo
- cp ${./scripts/create-darwin-volume.sh} $TMPDIR/create-darwin-volume.sh
- substitute ${./scripts/install-nix-from-closure.sh} $TMPDIR/install \
- --subst-var-by nix ${nix} \
- --subst-var-by cacert ${cacert}
-
- substitute ${./scripts/install-darwin-multi-user.sh} $TMPDIR/install-darwin-multi-user.sh \
- --subst-var-by nix ${nix} \
- --subst-var-by cacert ${cacert}
- substitute ${./scripts/install-systemd-multi-user.sh} $TMPDIR/install-systemd-multi-user.sh \
- --subst-var-by nix ${nix} \
- --subst-var-by cacert ${cacert}
- substitute ${./scripts/install-multi-user.sh} $TMPDIR/install-multi-user \
- --subst-var-by nix ${nix} \
- --subst-var-by cacert ${cacert}
-
- if type -p shellcheck; then
- # SC1090: Don't worry about not being able to find
- # $nix/etc/profile.d/nix.sh
- shellcheck --exclude SC1090 $TMPDIR/install
- shellcheck $TMPDIR/create-darwin-volume.sh
- shellcheck $TMPDIR/install-darwin-multi-user.sh
- shellcheck $TMPDIR/install-systemd-multi-user.sh
-
- # SC1091: Don't panic about not being able to source
- # /etc/profile
- # SC2002: Ignore "useless cat" "error", when loading
- # .reginfo, as the cat is a much cleaner
- # implementation, even though it is "useless"
- # SC2116: Allow ROOT_HOME=$(echo ~root) for resolving
- # root's home directory
- shellcheck --external-sources \
- --exclude SC1091,SC2002,SC2116 $TMPDIR/install-multi-user
- fi
-
- chmod +x $TMPDIR/install
- chmod +x $TMPDIR/create-darwin-volume.sh
- chmod +x $TMPDIR/install-darwin-multi-user.sh
- chmod +x $TMPDIR/install-systemd-multi-user.sh
- chmod +x $TMPDIR/install-multi-user
- dir=nix-super-${version}-${pkgs.system}
- fn=$out/$dir.tar.xz
- mkdir -p $out/nix-support
- echo "file binary-dist $fn" >> $out/nix-support/hydra-build-products
- tar cvfJ $fn \
- --owner=0 --group=0 --mode=u+rw,uga+r \
- --mtime='1970-01-01' \
- --absolute-names \
- --hard-dereference \
- --transform "s,$TMPDIR/install,$dir/install," \
- --transform "s,$TMPDIR/create-darwin-volume.sh,$dir/create-darwin-volume.sh," \
- --transform "s,$TMPDIR/reginfo,$dir/.reginfo," \
- --transform "s,$NIX_STORE,$dir/store,S" \
- $TMPDIR/install \
- $TMPDIR/create-darwin-volume.sh \
- $TMPDIR/install-darwin-multi-user.sh \
- $TMPDIR/install-systemd-multi-user.sh \
- $TMPDIR/install-multi-user \
- $TMPDIR/reginfo \
- $(cat ${installerClosureInfo}/store-paths)
- '';
-
overlayFor = getStdenv: final: prev:
- let currentStdenv = getStdenv final; in
+ let
+ stdenv = getStdenv final;
+ in
{
nixStable = prev.nix;
- # Forward from the previous stage as we don’t want it to pick the lowdown override
- nixUnstable = prev.nixUnstable;
+ default-busybox-sandbox-shell = final.busybox.override {
+ useMusl = true;
+ enableStatic = true;
+ enableMinimal = true;
+ extraConfig = ''
+ CONFIG_FEATURE_FANCY_ECHO y
+ CONFIG_FEATURE_SH_MATH y
+ CONFIG_FEATURE_SH_MATH_64 y
- nix =
- with final;
- with commonDeps {
- inherit pkgs;
- inherit (currentStdenv.hostPlatform) isStatic;
- };
- let
- canRunInstalled = currentStdenv.buildPlatform.canExecute currentStdenv.hostPlatform;
- in currentStdenv.mkDerivation (finalAttrs: {
- name = "nix-super-${version}";
- inherit version;
+ CONFIG_ASH y
+ CONFIG_ASH_OPTIMIZE_FOR_SIZE y
- src = nixSrc;
- VERSION_SUFFIX = versionSuffix;
-
- outputs = [ "out" "dev" "doc" ]
- ++ lib.optional (currentStdenv.hostPlatform != currentStdenv.buildPlatform) "check";
-
- nativeBuildInputs = nativeBuildDeps;
- buildInputs = buildDeps
- # There have been issues building these dependencies
- ++ lib.optionals (currentStdenv.hostPlatform == currentStdenv.buildPlatform) awsDeps
- ++ lib.optionals finalAttrs.doCheck checkDeps;
-
- propagatedBuildInputs = propagatedDeps;
-
- disallowedReferences = [ boost ];
-
- preConfigure = lib.optionalString (! currentStdenv.hostPlatform.isStatic)
- ''
- # Copy libboost_context so we don't get all of Boost in our closure.
- # https://github.com/NixOS/nixpkgs/issues/45462
- mkdir -p $out/lib
- cp -pd ${boost}/lib/{libboost_context*,libboost_thread*,libboost_system*} $out/lib
- rm -f $out/lib/*.a
- ${lib.optionalString currentStdenv.hostPlatform.isLinux ''
- chmod u+w $out/lib/*.so.*
- patchelf --set-rpath $out/lib:${currentStdenv.cc.cc.lib}/lib $out/lib/libboost_thread.so.*
- ''}
- ${lib.optionalString currentStdenv.hostPlatform.isDarwin ''
- for LIB in $out/lib/*.dylib; do
- chmod u+w $LIB
- install_name_tool -id $LIB $LIB
- install_name_tool -delete_rpath ${boost}/lib/ $LIB || true
- done
- install_name_tool -change ${boost}/lib/libboost_system.dylib $out/lib/libboost_system.dylib $out/lib/libboost_thread.dylib
- ''}
- '';
-
- configureFlags = configureFlags ++
- [ "--sysconfdir=/etc" ] ++
- lib.optional stdenv.hostPlatform.isStatic "--enable-embedded-sandbox-shell" ++
- [ (lib.enableFeature finalAttrs.doCheck "tests") ] ++
- lib.optionals finalAttrs.doCheck testConfigureFlags ++
- lib.optional (!canRunInstalled) "--disable-doc-gen";
-
- enableParallelBuilding = true;
-
- makeFlags = "profiledir=$(out)/etc/profile.d PRECOMPILE_HEADERS=1";
-
- doCheck = true;
-
- installFlags = "sysconfdir=$(out)/etc";
-
- postInstall = ''
- mkdir -p $doc/nix-support
- echo "doc manual $doc/share/doc/nix/manual" >> $doc/nix-support/hydra-build-products
- ${lib.optionalString currentStdenv.hostPlatform.isStatic ''
- mkdir -p $out/nix-support
- echo "file binary-dist $out/bin/nix" >> $out/nix-support/hydra-build-products
- ''}
- ${lib.optionalString currentStdenv.isDarwin ''
- install_name_tool \
- -change ${boost}/lib/libboost_context.dylib \
- $out/lib/libboost_context.dylib \
- $out/lib/libnixutil.dylib
- ''}
+ CONFIG_ASH_ALIAS y
+ CONFIG_ASH_BASH_COMPAT y
+ CONFIG_ASH_CMDCMD y
+ CONFIG_ASH_ECHO y
+ CONFIG_ASH_GETOPTS y
+ CONFIG_ASH_INTERNAL_GLOB y
+ CONFIG_ASH_JOB_CONTROL y
+ CONFIG_ASH_PRINTF y
+ CONFIG_ASH_TEST y
'';
+ };
- doInstallCheck = finalAttrs.doCheck;
- installCheckFlags = "sysconfdir=$(out)/etc";
- installCheckTarget = "installcheck"; # work around buggy detection in stdenv
-
- separateDebugInfo = !currentStdenv.hostPlatform.isStatic;
-
- strictDeps = true;
-
- hardeningDisable = lib.optional stdenv.hostPlatform.isStatic "pie";
-
- passthru.perl-bindings = final.callPackage ./perl {
- inherit fileset;
- stdenv = currentStdenv;
- };
-
- meta.platforms = lib.platforms.unix;
- meta.mainProgram = "nix";
+ libgit2-nix = final.libgit2.overrideAttrs (attrs: {
+ src = libgit2;
+ version = libgit2.lastModifiedDate;
+ cmakeFlags = attrs.cmakeFlags or []
+ ++ [ "-DUSE_SSH=exec" ];
});
- lowdown-nix = with final; currentStdenv.mkDerivation rec {
- name = "lowdown-0.9.0";
+ boehmgc-nix = (final.boehmgc.override {
+ enableLargeConfig = true;
+ }).overrideAttrs(o: {
+ patches = (o.patches or []) ++ [
+ ./dep-patches/boehmgc-coroutine-sp-fallback.diff
- src = lowdown-src;
+ # https://github.com/ivmai/bdwgc/pull/586
+ ./dep-patches/boehmgc-traceable_allocator-public.diff
+ ];
+ });
- outputs = [ "out" "bin" "dev" ];
+ changelog-d-nix = final.buildPackages.callPackage ./misc/changelog-d.nix { };
- nativeBuildInputs = [ buildPackages.which ];
+ nix =
+ let
+ officialRelease = false;
+ versionSuffix =
+ if officialRelease
+ then ""
+ else "pre${builtins.substring 0 8 (self.lastModifiedDate or self.lastModified or "19700101")}_${self.shortRev or "dirty"}";
- configurePhase = ''
- ${if (currentStdenv.isDarwin && currentStdenv.isAarch64) then "echo \"HAVE_SANDBOX_INIT=false\" > configure.local" else ""}
- ./configure \
- PREFIX=${placeholder "dev"} \
- BINDIR=${placeholder "bin"}/bin
- '';
+ in final.callPackage ./package.nix {
+ inherit
+ fileset
+ stdenv
+ versionSuffix
+ ;
+ officialRelease = false;
+ boehmgc = final.boehmgc-nix;
+ libgit2 = final.libgit2-nix;
+ busybox-sandbox-shell = final.busybox-sandbox-shell or final.default-busybox-sandbox-shell;
+ } // {
+ # this is a proper separate downstream package, but put
+ # here also for back compat reasons.
+ perl-bindings = final.nix-perl-bindings;
+ };
+
+ nix-perl-bindings = final.callPackage ./perl {
+ inherit fileset stdenv;
};
+
};
in {
@@ -539,19 +208,32 @@
# Binary package for various platforms.
build = forAllSystems (system: self.packages.${system}.nix);
+ shellInputs = forAllSystems (system: self.devShells.${system}.default.inputDerivation);
+
buildStatic = lib.genAttrs linux64BitSystems (system: self.packages.${system}.nix-static);
buildCross = forAllCrossSystems (crossSystem:
lib.genAttrs ["x86_64-linux"] (system: self.packages.${system}."nix-${crossSystem}"));
- buildNoGc = forAllSystems (system: self.packages.${system}.nix.overrideAttrs (a: { configureFlags = (a.configureFlags or []) ++ ["--enable-gc=no"];}));
+ buildNoGc = forAllSystems (system:
+ self.packages.${system}.nix.override { enableGC = false; }
+ );
buildNoTests = forAllSystems (system:
- self.packages.${system}.nix.overrideAttrs (a: {
- doCheck =
- assert ! a?dontCheck;
- false;
- })
+ self.packages.${system}.nix.override {
+ doCheck = false;
+ doInstallCheck = false;
+ installUnitTests = false;
+ }
+ );
+
+  # Toggles some settings for better coverage. Windows needs these
+  # library combinations, and Debian builds Nix with GNU readline too.
+ buildReadlineNoMarkdown = forAllSystems (system:
+ self.packages.${system}.nix.override {
+ enableMarkdown = false;
+ readlineFlavor = "readline";
+ }
);
# Perl bindings for various platforms.
@@ -572,67 +254,41 @@
# to https://nixos.org/nix/install. It downloads the binary
# tarball for the user's system and calls the second half of the
# installation script.
- installerScript = installScriptFor [ "x86_64-linux" "i686-linux" "aarch64-linux" "x86_64-darwin" "aarch64-darwin" "armv6l-linux" "armv7l-linux" ];
- installerScriptForGHA = installScriptFor [ "x86_64-linux" "x86_64-darwin" "armv6l-linux" "armv7l-linux"];
+ installerScript = installScriptFor [
+ # Native
+ self.hydraJobs.binaryTarball."x86_64-linux"
+ self.hydraJobs.binaryTarball."i686-linux"
+ self.hydraJobs.binaryTarball."aarch64-linux"
+ self.hydraJobs.binaryTarball."x86_64-darwin"
+ self.hydraJobs.binaryTarball."aarch64-darwin"
+ # Cross
+ self.hydraJobs.binaryTarballCross."x86_64-linux"."armv6l-unknown-linux-gnueabihf"
+ self.hydraJobs.binaryTarballCross."x86_64-linux"."armv7l-unknown-linux-gnueabihf"
+ ];
+ installerScriptForGHA = installScriptFor [
+ # Native
+ self.hydraJobs.binaryTarball."x86_64-linux"
+ self.hydraJobs.binaryTarball."x86_64-darwin"
+ # Cross
+ self.hydraJobs.binaryTarballCross."x86_64-linux"."armv6l-unknown-linux-gnueabihf"
+ self.hydraJobs.binaryTarballCross."x86_64-linux"."armv7l-unknown-linux-gnueabihf"
+ ];
# docker image with Nix inside
dockerImage = lib.genAttrs linux64BitSystems (system: self.packages.${system}.dockerImage);
# Line coverage analysis.
- coverage =
- with nixpkgsFor.x86_64-linux.native;
- with commonDeps { inherit pkgs; };
-
- releaseTools.coverageAnalysis {
- name = "nix-super-coverage-${version}";
-
- src = nixSrc;
-
- configureFlags = testConfigureFlags;
-
- enableParallelBuilding = true;
-
- nativeBuildInputs = nativeBuildDeps;
- buildInputs = buildDeps ++ propagatedDeps ++ awsDeps ++ checkDeps;
-
- dontInstall = false;
-
- doInstallCheck = true;
- installCheckTarget = "installcheck"; # work around buggy detection in stdenv
-
- lcovFilter = [ "*/boost/*" "*-tab.*" ];
-
- hardeningDisable = ["fortify"];
-
- NIX_CFLAGS_COMPILE = "-DCOVERAGE=1";
- };
+ coverage = nixpkgsFor.x86_64-linux.native.nix.override {
+ pname = "nix-coverage";
+ withCoverageChecks = true;
+ };
# API docs for Nix's unstable internal C++ interfaces.
- internal-api-docs =
- with nixpkgsFor.x86_64-linux.native;
- with commonDeps { inherit pkgs; };
-
- stdenv.mkDerivation {
- pname = "nix-internal-api-docs";
- inherit version;
-
- src = nixSrc;
-
- configureFlags = testConfigureFlags ++ internalApiDocsConfigureFlags;
-
- nativeBuildInputs = nativeBuildDeps;
- buildInputs = buildDeps ++ propagatedDeps
- ++ awsDeps ++ checkDeps ++ internalApiDocsDeps;
-
- dontBuild = true;
-
- installTargets = [ "internal-api-html" ];
-
- postInstall = ''
- mkdir -p $out/nix-support
- echo "doc internal-api-docs $out/share/doc/nix/internal-api/html" >> $out/nix-support/hydra-build-products
- '';
- };
+ internal-api-docs = nixpkgsFor.x86_64-linux.native.callPackage ./package.nix {
+ inherit fileset;
+ doBuild = false;
+ enableInternalAPIDocs = true;
+ };
# System tests.
tests = import ./tests/nixos { inherit lib nixpkgs nixpkgsFor; } // {
@@ -640,13 +296,18 @@
# Make sure that nix-env still produces the exact same result
# on a particular version of Nixpkgs.
evalNixpkgs =
- with nixpkgsFor.x86_64-linux.native;
+ let
+ inherit (nixpkgsFor.x86_64-linux.native) runCommand nix;
+ in
runCommand "eval-nixos" { buildInputs = [ nix ]; }
''
type -p nix-env
# Note: we're filtering out nixos-install-tools because https://github.com/NixOS/nixpkgs/pull/153594#issuecomment-1020530593.
- time nix-env --store dummy:// -f ${nixpkgs-regression} -qaP --drv-path | sort | grep -v nixos-install-tools > packages
- [[ $(sha1sum < packages | cut -c1-40) = ff451c521e61e4fe72bdbe2d0ca5d1809affa733 ]]
+ (
+ set -x
+ time nix-env --store dummy:// -f ${nixpkgs-regression} -qaP --drv-path | sort | grep -v nixos-install-tools > packages
+ [[ $(sha1sum < packages | cut -c1-40) = e01b031fc9785a572a38be6bc473957e3b6faad7 ]]
+ )
mkdir $out
'';
@@ -687,15 +348,24 @@
checks = forAllSystems (system: {
binaryTarball = self.hydraJobs.binaryTarball.${system};
- perlBindings = self.hydraJobs.perlBindings.${system};
installTests = self.hydraJobs.installTests.${system};
nixpkgsLibTests = self.hydraJobs.tests.nixpkgsLibTests.${system};
+ rl-next =
+ let pkgs = nixpkgsFor.${system}.native;
+ in pkgs.buildPackages.runCommand "test-rl-next-release-notes" { } ''
+ LANG=C.UTF-8 ${pkgs.changelog-d-nix}/bin/changelog-d ${./doc/manual/rl-next} >$out
+ '';
} // (lib.optionalAttrs (builtins.elem system linux64BitSystems)) {
dockerImage = self.hydraJobs.dockerImage.${system};
+ } // (lib.optionalAttrs (!(builtins.elem system linux32BitSystems))) {
+ # Some perl dependencies are broken on i686-linux.
+ # Since the support is only best-effort there, disable the perl
+ # bindings
+ perlBindings = self.hydraJobs.perlBindings.${system};
});
packages = forAllSystems (system: rec {
- inherit (nixpkgsFor.${system}.native) nix;
+ inherit (nixpkgsFor.${system}.native) nix changelog-d-nix;
default = nix;
} // (lib.optionalAttrs (builtins.elem system linux64BitSystems) {
nix-static = nixpkgsFor.${system}.static.nix;
@@ -727,47 +397,23 @@
stdenvs)));
devShells = let
- makeShell = pkgs: stdenv:
- let
- canRunInstalled = stdenv.buildPlatform.canExecute stdenv.hostPlatform;
- in
- with commonDeps { inherit pkgs; };
- stdenv.mkDerivation {
- name = "nix-super";
+ makeShell = pkgs: stdenv: (pkgs.nix.override { inherit stdenv; forDevShell = true; }).overrideAttrs (attrs: {
+ installFlags = "sysconfdir=$(out)/etc";
+ shellHook = ''
+ PATH=$prefix/bin:$PATH
+ unset PYTHONPATH
+ export MANPATH=$out/share/man:$MANPATH
- outputs = [ "out" "dev" "doc" ]
- ++ lib.optional (stdenv.hostPlatform != stdenv.buildPlatform) "check";
+ # Make bash completion work.
+ XDG_DATA_DIRS+=:$out/share
+ '';
- nativeBuildInputs = nativeBuildDeps
- ++ lib.optional stdenv.cc.isClang pkgs.buildPackages.bear
- ++ lib.optional
- (stdenv.cc.isClang && stdenv.hostPlatform == stdenv.buildPlatform)
- pkgs.buildPackages.clang-tools
- # We want changelog-d in the shell even if the current build doesn't need it
- ++ lib.optional (officialRelease || ! buildUnreleasedNotes) changelog-d
- ;
-
- buildInputs = buildDeps ++ propagatedDeps
- ++ awsDeps ++ checkDeps ++ internalApiDocsDeps;
-
- configureFlags = configureFlags
- ++ testConfigureFlags ++ internalApiDocsConfigureFlags
- ++ lib.optional (!canRunInstalled) "--disable-doc-gen";
-
- enableParallelBuilding = true;
-
- installFlags = "sysconfdir=$(out)/etc";
-
- shellHook =
- ''
- PATH=$prefix/bin:$PATH
- unset PYTHONPATH
- export MANPATH=$out/share/man:$MANPATH
-
- # Make bash completion work.
- XDG_DATA_DIRS+=:$out/share
- '';
- };
+ nativeBuildInputs = attrs.nativeBuildInputs or []
+ # TODO: Remove the darwin check once
+ # https://github.com/NixOS/nixpkgs/pull/291814 is available
+ ++ lib.optional (stdenv.cc.isClang && !stdenv.buildPlatform.isDarwin) pkgs.buildPackages.bear
+ ++ lib.optional (stdenv.cc.isClang && stdenv.hostPlatform == stdenv.buildPlatform) pkgs.buildPackages.clang-tools;
+ });
in
forAllSystems (system:
let
@@ -777,8 +423,9 @@
(forAllStdenvs (stdenvName: makeShell pkgs pkgs.${stdenvName}));
in
(makeShells "native" nixpkgsFor.${system}.native) //
- (makeShells "static" nixpkgsFor.${system}.static) //
- (forAllCrossSystems (crossSystem: let pkgs = nixpkgsFor.${system}.cross.${crossSystem}; in makeShell pkgs pkgs.stdenv)) //
+ (lib.optionalAttrs (!nixpkgsFor.${system}.native.stdenv.isDarwin)
+ (makeShells "static" nixpkgsFor.${system}.static)) //
+ (lib.genAttrs shellCrossSystems (crossSystem: let pkgs = nixpkgsFor.${system}.cross.${crossSystem}; in makeShell pkgs pkgs.stdenv)) //
{
default = self.devShells.${system}.native-stdenvPackages;
}
diff --git a/maintainers/README.md b/maintainers/README.md
index ee97c1195..fa321c7c0 100644
--- a/maintainers/README.md
+++ b/maintainers/README.md
@@ -43,7 +43,11 @@ The team meets twice a week:
- Discussion meeting: [Fridays 13:00-14:00 CET](https://calendar.google.com/calendar/event?eid=MHNtOGVuNWtrZXNpZHR2bW1sM3QyN2ZjaGNfMjAyMjExMjVUMTIwMDAwWiBiOW81MmZvYnFqYWs4b3E4bGZraGczdDBxZ0Bn)
1. Triage issues and pull requests from the [No Status](#no-status) column (30 min)
- 2. Discuss issues and pull requests from the [To discuss](#to-discuss) column (30 min)
+ 2. Discuss issues and pull requests from the [To discuss](#to-discuss) column (30 min).
+     Once a month, each team member checks the [Assigned](#assigned) column for PRs/issues assigned to them, and for each one either
+     - unblocks it by providing input
+     - marks it as draft if it is blocked on the contributor
+     - escalates it back to the team by moving it to To discuss, and leaving a comment as to why the issue needs to be discussed again.
- Work meeting: [Mondays 13:00-15:00 CET](https://calendar.google.com/calendar/event?eid=NTM1MG1wNGJnOGpmOTZhYms3bTB1bnY5cWxfMjAyMjExMjFUMTIwMDAwWiBiOW81MmZvYnFqYWs4b3E4bGZraGczdDBxZ0Bn)
diff --git a/maintainers/release-notes b/maintainers/release-notes
index 34cd85a56..2d84485c1 100755
--- a/maintainers/release-notes
+++ b/maintainers/release-notes
@@ -1,7 +1,5 @@
-#!/usr/bin/env nix-shell
-#!nix-shell -i bash ../shell.nix -I nixpkgs=channel:nixos-unstable-small
-# ^^^^^^^
-# Only used for bash. shell.nix goes to the flake.
+#!/usr/bin/env nix
+#!nix shell .#changelog-d-nix --command bash
# --- CONFIGURATION ---
diff --git a/maintainers/release-process.md b/maintainers/release-process.md
index db8b064a5..da6886ea9 100644
--- a/maintainers/release-process.md
+++ b/maintainers/release-process.md
@@ -27,8 +27,9 @@ release:
* Compile the release notes by running
```console
+ $ export VERSION=X.YY
$ git checkout -b release-notes
- $ VERSION=X.YY ./maintainers/release-notes
+ $ ./maintainers/release-notes
```
where `X.YY` is *without* the patch level, e.g. `2.12` rather than ~~`2.12.0`~~.
diff --git a/maintainers/upload-release.pl b/maintainers/upload-release.pl
index ebc536f12..f2830a3af 100755
--- a/maintainers/upload-release.pl
+++ b/maintainers/upload-release.pl
@@ -11,6 +11,8 @@ use JSON::PP;
use LWP::UserAgent;
use Net::Amazon::S3;
+delete $ENV{'shell'}; # shut up a LWP::UserAgent.pm warning
+
my $evalId = $ARGV[0] or die "Usage: $0 EVAL-ID\n";
my $releasesBucketName = "nix-releases";
@@ -36,9 +38,9 @@ sub fetch {
my $evalUrl = "https://hydra.nixos.org/eval/$evalId";
my $evalInfo = decode_json(fetch($evalUrl, 'application/json'));
#print Dumper($evalInfo);
-my $flakeUrl = $evalInfo->{flake} or die;
-my $flakeInfo = decode_json(`nix flake metadata --json "$flakeUrl"` or die);
-my $nixRev = $flakeInfo->{revision} or die;
+my $flakeUrl = $evalInfo->{flake};
+my $flakeInfo = decode_json(`nix flake metadata --json "$flakeUrl"` or die) if $flakeUrl;
+my $nixRev = ($flakeInfo ? $flakeInfo->{revision} : $evalInfo->{jobsetevalinputs}->{nix}->{revision}) or die;
my $buildInfo = decode_json(fetch("$evalUrl/job/build.x86_64-linux", 'application/json'));
#print Dumper($buildInfo);
@@ -83,12 +85,19 @@ my $channelsBucket = $s3_us->bucket($channelsBucketName) or die;
sub getStorePath {
my ($jobName, $output) = @_;
my $buildInfo = decode_json(fetch("$evalUrl/job/$jobName", 'application/json'));
- return $buildInfo->{buildoutputs}->{$output or "out"}->{path} or die "cannot get store path for '$jobName'";
+ return $buildInfo->{buildoutputs}->{$output or "out"}->{path} // die "cannot get store path for '$jobName'";
}
sub copyManual {
- my $manual = getStorePath("build.x86_64-linux", "doc");
- print "$manual\n";
+ my $manual;
+ eval {
+ $manual = getStorePath("build.x86_64-linux", "doc");
+ };
+ if ($@) {
+ warn "$@";
+ return;
+ }
+ print "Manual: $manual\n";
my $manualNar = "$tmpDir/$releaseName-manual.nar.xz";
print "$manualNar\n";
@@ -154,19 +163,33 @@ downloadFile("binaryTarball.x86_64-linux", "1");
downloadFile("binaryTarball.aarch64-linux", "1");
downloadFile("binaryTarball.x86_64-darwin", "1");
downloadFile("binaryTarball.aarch64-darwin", "1");
-downloadFile("binaryTarballCross.x86_64-linux.armv6l-linux", "1");
-downloadFile("binaryTarballCross.x86_64-linux.armv7l-linux", "1");
+eval {
+ downloadFile("binaryTarballCross.x86_64-linux.armv6l-unknown-linux-gnueabihf", "1");
+};
+warn "$@" if $@;
+eval {
+ downloadFile("binaryTarballCross.x86_64-linux.armv7l-unknown-linux-gnueabihf", "1");
+};
+warn "$@" if $@;
downloadFile("installerScript", "1");
# Upload docker images to dockerhub.
my $dockerManifest = "";
my $dockerManifestLatest = "";
+my $haveDocker = 0;
for my $platforms (["x86_64-linux", "amd64"], ["aarch64-linux", "arm64"]) {
my $system = $platforms->[0];
my $dockerPlatform = $platforms->[1];
my $fn = "nix-$version-docker-image-$dockerPlatform.tar.gz";
- downloadFile("dockerImage.$system", "1", $fn);
+ eval {
+ downloadFile("dockerImage.$system", "1", $fn);
+ };
+ if ($@) {
+ warn "$@" if $@;
+ next;
+ }
+ $haveDocker = 1;
print STDERR "loading docker image for $dockerPlatform...\n";
system("docker load -i $tmpDir/$fn") == 0 or die;
@@ -194,21 +217,23 @@ for my $platforms (["x86_64-linux", "amd64"], ["aarch64-linux", "arm64"]) {
$dockerManifestLatest .= " --amend $latestTag"
}
-print STDERR "creating multi-platform docker manifest...\n";
-system("docker manifest rm nixos/nix:$version");
-system("docker manifest create nixos/nix:$version $dockerManifest") == 0 or die;
-if ($isLatest) {
- print STDERR "creating latest multi-platform docker manifest...\n";
- system("docker manifest rm nixos/nix:latest");
- system("docker manifest create nixos/nix:latest $dockerManifestLatest") == 0 or die;
-}
+if ($haveDocker) {
+ print STDERR "creating multi-platform docker manifest...\n";
+ system("docker manifest rm nixos/nix:$version");
+ system("docker manifest create nixos/nix:$version $dockerManifest") == 0 or die;
+ if ($isLatest) {
+ print STDERR "creating latest multi-platform docker manifest...\n";
+ system("docker manifest rm nixos/nix:latest");
+ system("docker manifest create nixos/nix:latest $dockerManifestLatest") == 0 or die;
+ }
-print STDERR "pushing multi-platform docker manifest...\n";
-system("docker manifest push nixos/nix:$version") == 0 or die;
+ print STDERR "pushing multi-platform docker manifest...\n";
+ system("docker manifest push nixos/nix:$version") == 0 or die;
-if ($isLatest) {
- print STDERR "pushing latest multi-platform docker manifest...\n";
- system("docker manifest push nixos/nix:latest") == 0 or die;
+ if ($isLatest) {
+ print STDERR "pushing latest multi-platform docker manifest...\n";
+ system("docker manifest push nixos/nix:latest") == 0 or die;
+ }
}
# Upload nix-fallback-paths.nix.
diff --git a/misc/changelog-d.cabal.nix b/misc/changelog-d.cabal.nix
new file mode 100644
index 000000000..76f9353cd
--- /dev/null
+++ b/misc/changelog-d.cabal.nix
@@ -0,0 +1,31 @@
+{ mkDerivation, aeson, base, bytestring, cabal-install-parsers
+, Cabal-syntax, containers, directory, filepath, frontmatter
+, generic-lens-lite, lib, mtl, optparse-applicative, parsec, pretty
+, regex-applicative, text, pkgs
+}:
+let rev = "f30f6969e9cd8b56242309639d58acea21c99d06";
+in
+mkDerivation {
+ pname = "changelog-d";
+ version = "0.1";
+ src = pkgs.fetchurl {
+ name = "changelog-d-${rev}.tar.gz";
+ url = "https://codeberg.org/roberth/changelog-d/archive/${rev}.tar.gz";
+ hash = "sha256-8a2+i5u7YoszAgd5OIEW0eYUcP8yfhtoOIhLJkylYJ4=";
+ } // { inherit rev; };
+ isLibrary = false;
+ isExecutable = true;
+ libraryHaskellDepends = [
+ aeson base bytestring cabal-install-parsers Cabal-syntax containers
+ directory filepath frontmatter generic-lens-lite mtl parsec pretty
+ regex-applicative text
+ ];
+ executableHaskellDepends = [
+ base bytestring Cabal-syntax directory filepath
+ optparse-applicative
+ ];
+ doHaddock = false;
+ description = "Concatenate changelog entries into a single one";
+ license = lib.licenses.gpl3Plus;
+ mainProgram = "changelog-d";
+}
diff --git a/misc/changelog-d.nix b/misc/changelog-d.nix
new file mode 100644
index 000000000..1b20f4596
--- /dev/null
+++ b/misc/changelog-d.nix
@@ -0,0 +1,31 @@
+# Taken temporarily from
+{
+ callPackage,
+ lib,
+ haskell,
+ haskellPackages,
+}:
+
+let
+ hsPkg = haskellPackages.callPackage ./changelog-d.cabal.nix { };
+
+ addCompletions = haskellPackages.generateOptparseApplicativeCompletions ["changelog-d"];
+
+ haskellModifications =
+ lib.flip lib.pipe [
+ addCompletions
+ haskell.lib.justStaticExecutables
+ ];
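+  # Note: `lib.flip lib.pipe [ f g ]` is equivalent to `x: g (f x)`, so the
+  # modifications above are applied left to right; e.g. `haskellModifications hsPkg`
+  # below is `haskell.lib.justStaticExecutables (addCompletions hsPkg)`.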
+
+ mkDerivationOverrides = finalAttrs: oldAttrs: {
+
+ version = oldAttrs.version + "-git-${lib.strings.substring 0 7 oldAttrs.src.rev}";
+
+ meta = oldAttrs.meta // {
+ homepage = "https://codeberg.org/roberth/changelog-d";
+ maintainers = [ lib.maintainers.roberth ];
+ };
+
+ };
+in
+ (haskellModifications hsPkg).overrideAttrs mkDerivationOverrides
diff --git a/mk/compilation-database.mk b/mk/compilation-database.mk
new file mode 100644
index 000000000..f69dc0de0
--- /dev/null
+++ b/mk/compilation-database.mk
@@ -0,0 +1,11 @@
+compile-commands-json-files :=
+
+define write-compile-commands
+ _srcs := $$(sort $$(foreach src, $$($(1)_SOURCES), $$(src)))
+
+ $(1)_COMPILE_COMMANDS_JSON := $$(addprefix $(buildprefix), $$(addsuffix .compile_commands.json, $$(basename $$(_srcs))))
+
+ compile-commands-json-files += $$($(1)_COMPILE_COMMANDS_JSON)
+
+ clean-files += $$($(1)_COMPILE_COMMANDS_JSON)
+endef
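+
+# Sketch of intended usage (the real call sites are in mk/lib.mk): for each
+# library or program `foo`, `$(eval $(call write-compile-commands,foo))`
+# registers one `.compile_commands.json` fragment per file in `foo_SOURCES`,
+# and the top-level `compile_commands.json` target merges the fragments.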
diff --git a/mk/cxx-big-literal.mk b/mk/cxx-big-literal.mk
index 85365df8e..d64a171c8 100644
--- a/mk/cxx-big-literal.mk
+++ b/mk/cxx-big-literal.mk
@@ -1,5 +1,5 @@
%.gen.hh: %
- @echo 'R"foo(' >> $@.tmp
+ @echo 'R"__NIX_STR(' >> $@.tmp
$(trace-gen) cat $< >> $@.tmp
- @echo ')foo"' >> $@.tmp
+ @echo ')__NIX_STR"' >> $@.tmp
@mv $@.tmp $@
diff --git a/mk/disable-tests.mk b/mk/disable-tests.mk
deleted file mode 100644
index f72f84412..000000000
--- a/mk/disable-tests.mk
+++ /dev/null
@@ -1,12 +0,0 @@
-# This file is only active for `./configure --disable-tests`.
-# Running `make check` or `make installcheck` would indicate a mistake in the
-# caller.
-
-installcheck:
- @echo "Tests are disabled. Configure without '--disable-tests', or avoid calling 'make installcheck'."
- @exit 1
-
-# This currently has little effect.
-check:
- @echo "Tests are disabled. Configure without '--disable-tests', or avoid calling 'make check'."
- @exit 1
diff --git a/mk/lib.mk b/mk/lib.mk
index 3d503364f..a002d823f 100644
--- a/mk/lib.mk
+++ b/mk/lib.mk
@@ -12,24 +12,7 @@ man-pages :=
install-tests :=
install-tests-groups :=
-ifdef HOST_OS
- HOST_KERNEL = $(firstword $(subst -, ,$(HOST_OS)))
- ifeq ($(HOST_KERNEL), cygwin)
- HOST_CYGWIN = 1
- endif
- ifeq ($(patsubst darwin%,,$(HOST_KERNEL)),)
- HOST_DARWIN = 1
- endif
- ifeq ($(patsubst freebsd%,,$(HOST_KERNEL)),)
- HOST_FREEBSD = 1
- endif
- ifeq ($(HOST_KERNEL), linux)
- HOST_LINUX = 1
- endif
- ifeq ($(patsubst solaris%,,$(HOST_KERNEL)),)
- HOST_SOLARIS = 1
- endif
-endif
+include mk/platform.mk
# Hack to define a literal space.
space :=
@@ -85,6 +68,7 @@ include mk/patterns.mk
include mk/templates.mk
include mk/cxx-big-literal.mk
include mk/tests.mk
+include mk/compilation-database.mk
# Include all sub-Makefiles.
@@ -114,6 +98,17 @@ $(foreach test-group, $(install-tests-groups), \
$(eval $(call run-test,$(test),$(install_test_init))) \
$(eval $(test-group).test-group: $(test).test)))
+# Compilation database.
+$(foreach lib, $(libraries), $(eval $(call write-compile-commands,$(lib))))
+$(foreach prog, $(programs), $(eval $(call write-compile-commands,$(prog))))
+
+compile_commands.json: $(compile-commands-json-files)
+ @jq --slurp '.' $^ >$@
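+
+# Typical use: run `make compile_commands.json` and point clangd or
+# clang-tidy at the generated file.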
+
+# Include makefiles requiring built programs.
+$(foreach mf, $(makefiles-late), $(eval $(call include-sub-makefile,$(mf))))
+
+
$(foreach file, $(man-pages), $(eval $(call install-data-in, $(file), $(mandir)/man$(patsubst .%,%,$(suffix $(file))))))
diff --git a/mk/libraries.mk b/mk/libraries.mk
index 1bc73d7f7..b99ba2782 100644
--- a/mk/libraries.mk
+++ b/mk/libraries.mk
@@ -3,13 +3,19 @@ libs-list :=
ifdef HOST_DARWIN
SO_EXT = dylib
else
- ifdef HOST_CYGWIN
+ ifdef HOST_WINDOWS
SO_EXT = dll
else
SO_EXT = so
endif
endif
+ifdef HOST_UNIX
+ THREAD_LDFLAGS = -pthread
+else
+ THREAD_LDFLAGS =
+endif
+
# Build a library with symbolic name $(1). The library is defined by
# various variables prefixed by ‘$(1)_’:
#
@@ -59,7 +65,7 @@ define build-library
$(1)_OBJS := $$(addprefix $(buildprefix), $$(addsuffix .o, $$(basename $$(_srcs))))
_libs := $$(foreach lib, $$($(1)_LIBS), $$($$(lib)_PATH))
- ifdef HOST_CYGWIN
+ ifdef HOST_WINDOWS
$(1)_INSTALL_DIR ?= $$(bindir)
else
$(1)_INSTALL_DIR ?= $$(libdir)
@@ -79,7 +85,7 @@ define build-library
endif
else
ifndef HOST_DARWIN
- ifndef HOST_CYGWIN
+ ifndef HOST_WINDOWS
$(1)_LDFLAGS += -Wl,-z,defs
endif
endif
diff --git a/mk/patterns.mk b/mk/patterns.mk
index c81150260..4caa2039e 100644
--- a/mk/patterns.mk
+++ b/mk/patterns.mk
@@ -1,11 +1,41 @@
+
+# These are the complete command lines we use to compile C and C++ files.
+# - $< is the source file.
+# - $1 is the object file to create.
+CC_CMD=$(CC) -o $1 -c $< $(CPPFLAGS) $(GLOBAL_CFLAGS) $(CFLAGS) $($1_CFLAGS) -MMD -MF $(call filename-to-dep,$1) -MP
+CXX_CMD=$(CXX) -o $1 -c $< $(CPPFLAGS) $(GLOBAL_CXXFLAGS_PCH) $(GLOBAL_CXXFLAGS) $(CXXFLAGS) $($1_CXXFLAGS) $(ERROR_SWITCH_ENUM) -MMD -MF $(call filename-to-dep,$1) -MP
+
+# We use COMPILE_COMMANDS_JSON_CMD to turn a compilation command (like CC_CMD
+# or CXX_CMD above) into a compile_commands.json file. We rely on bash's native
+# word splitting to define the positional arguments.
+# - $< is the source file being compiled.
+COMPILE_COMMANDS_JSON_CMD=jq --null-input '{ directory: $$ENV.PWD, file: "$<", arguments: $$ARGS.positional }' --args --
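+#
+# For illustration (hypothetical paths), once make has expanded $< and the
+# compile command, the generated shell command looks roughly like:
+#
+#   jq --null-input \
+#     '{ directory: $ENV.PWD, file: "foo.cc", arguments: $ARGS.positional }' \
+#     --args -- g++ -o foo.o -c foo.cc -O2
+#
+# which prints a single compilation-database entry, e.g.
+#   { "directory": "/build", "file": "foo.cc", "arguments": ["g++","-o","foo.o","-c","foo.cc","-O2"] }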
+
+
$(buildprefix)%.o: %.cc
@mkdir -p "$(dir $@)"
- $(trace-cxx) $(CXX) -o $@ -c $< $(CPPFLAGS) $(GLOBAL_CXXFLAGS_PCH) $(GLOBAL_CXXFLAGS) $(CXXFLAGS) $($@_CXXFLAGS) $(ERROR_SWITCH_ENUM) -MMD -MF $(call filename-to-dep, $@) -MP
+ $(trace-cxx) $(call CXX_CMD,$@)
$(buildprefix)%.o: %.cpp
@mkdir -p "$(dir $@)"
- $(trace-cxx) $(CXX) -o $@ -c $< $(CPPFLAGS) $(GLOBAL_CXXFLAGS_PCH) $(GLOBAL_CXXFLAGS) $(CXXFLAGS) $($@_CXXFLAGS) $(ERROR_SWITCH_ENUM) -MMD -MF $(call filename-to-dep, $@) -MP
+ $(trace-cxx) $(call CXX_CMD,$@)
$(buildprefix)%.o: %.c
@mkdir -p "$(dir $@)"
- $(trace-cc) $(CC) -o $@ -c $< $(CPPFLAGS) $(GLOBAL_CFLAGS) $(CFLAGS) $($@_CFLAGS) -MMD -MF $(call filename-to-dep, $@) -MP
+ $(trace-cc) $(call CC_CMD,$@)
+
+# In the following we need to replace the .compile_commands.json extension in $@ with .o
+# to recover the object file name. This is needed because CC_CMD and CXX_CMD do further
+# expansions based on the object file name (e.g. *_CXXFLAGS and filename-to-dep).
+
+$(buildprefix)%.compile_commands.json: %.cc
+ @mkdir -p "$(dir $@)"
+ $(trace-jq) $(COMPILE_COMMANDS_JSON_CMD) $(call CXX_CMD,$(@:.compile_commands.json=.o)) > $@
+
+$(buildprefix)%.compile_commands.json: %.cpp
+ @mkdir -p "$(dir $@)"
+ $(trace-jq) $(COMPILE_COMMANDS_JSON_CMD) $(call CXX_CMD,$(@:.compile_commands.json=.o)) > $@
+
+$(buildprefix)%.compile_commands.json: %.c
+ @mkdir -p "$(dir $@)"
+ $(trace-jq) $(COMPILE_COMMANDS_JSON_CMD) $(call CC_CMD,$(@:.compile_commands.json=.o)) > $@
diff --git a/mk/platform.mk b/mk/platform.mk
new file mode 100644
index 000000000..fe960dedf
--- /dev/null
+++ b/mk/platform.mk
@@ -0,0 +1,32 @@
+ifdef HOST_OS
+ HOST_KERNEL = $(firstword $(subst -, ,$(HOST_OS)))
+ ifeq ($(patsubst mingw%,,$(HOST_KERNEL)),)
+ HOST_MINGW = 1
+ HOST_WINDOWS = 1
+ endif
+ ifeq ($(HOST_KERNEL), cygwin)
+ HOST_CYGWIN = 1
+ HOST_WINDOWS = 1
+ HOST_UNIX = 1
+ endif
+ ifeq ($(patsubst darwin%,,$(HOST_KERNEL)),)
+ HOST_DARWIN = 1
+ HOST_UNIX = 1
+ endif
+ ifeq ($(patsubst freebsd%,,$(HOST_KERNEL)),)
+ HOST_FREEBSD = 1
+ HOST_UNIX = 1
+ endif
+ ifeq ($(patsubst netbsd%,,$(HOST_KERNEL)),)
+ HOST_NETBSD = 1
+ HOST_UNIX = 1
+ endif
+ ifeq ($(HOST_KERNEL), linux)
+ HOST_LINUX = 1
+ HOST_UNIX = 1
+ endif
+ ifeq ($(patsubst solaris%,,$(HOST_KERNEL)),)
+ HOST_SOLARIS = 1
+ HOST_UNIX = 1
+ endif
+endif
diff --git a/mk/programs.mk b/mk/programs.mk
index 6235311e9..623caaf55 100644
--- a/mk/programs.mk
+++ b/mk/programs.mk
@@ -1,5 +1,11 @@
programs-list :=
+ifdef HOST_WINDOWS
+ EXE_EXT = .exe
+else
+ EXE_EXT =
+endif
+
# Build a program with symbolic name $(1). The program is defined by
# various variables prefixed by ‘$(1)_’:
#
@@ -31,7 +37,7 @@ define build-program
_srcs := $$(sort $$(foreach src, $$($(1)_SOURCES), $$(src)))
$(1)_OBJS := $$(addprefix $(buildprefix), $$(addsuffix .o, $$(basename $$(_srcs))))
_libs := $$(foreach lib, $$($(1)_LIBS), $$(foreach lib2, $$($$(lib)_LIB_CLOSURE), $$($$(lib2)_PATH)))
- $(1)_PATH := $$(_d)/$$($(1)_NAME)
+ $(1)_PATH := $$(_d)/$$($(1)_NAME)$(EXE_EXT)
$$(eval $$(call create-dir, $$(_d)))
@@ -42,7 +48,7 @@ define build-program
ifdef $(1)_INSTALL_DIR
- $(1)_INSTALL_PATH := $$($(1)_INSTALL_DIR)/$$($(1)_NAME)
+ $(1)_INSTALL_PATH := $$($(1)_INSTALL_DIR)/$$($(1)_NAME)$(EXE_EXT)
$$(eval $$(call create-dir, $$($(1)_INSTALL_DIR)))
diff --git a/mk/templates.mk b/mk/templates.mk
index 866bdc17f..d5dae61c7 100644
--- a/mk/templates.mk
+++ b/mk/templates.mk
@@ -10,10 +10,10 @@ endef
ifneq ($(MAKECMDGOALS), clean)
-$(buildprefix)%.h: %.h.in
+$(buildprefix)%.h: %.h.in $(buildprefix)config.status
$(trace-gen) rm -f $@ && cd $(buildprefixrel) && ./config.status --quiet --header=$(@:$(buildprefix)%=%)
-$(buildprefix)%: %.in
+$(buildprefix)%: %.in $(buildprefix)config.status
$(trace-gen) rm -f $@ && cd $(buildprefixrel) && ./config.status --quiet --file=$(@:$(buildprefix)%=%)
endif
diff --git a/mk/tracing.mk b/mk/tracing.mk
index 1fc5573d7..09db1e617 100644
--- a/mk/tracing.mk
+++ b/mk/tracing.mk
@@ -10,6 +10,8 @@ ifeq ($(V), 0)
trace-install = @echo " INST " $@;
trace-mkdir = @echo " MKDIR " $@;
trace-test = @echo " TEST " $@;
+ trace-sh = @echo " SH " $@;
+ trace-jq = @echo " JQ " $@;
suppress = @
diff --git a/package.nix b/package.nix
new file mode 100644
index 000000000..7d9a39771
--- /dev/null
+++ b/package.nix
@@ -0,0 +1,397 @@
+{ lib
+, stdenv
+, releaseTools
+, autoconf-archive
+, autoreconfHook
+, aws-sdk-cpp
+, boehmgc
+, nlohmann_json
+, bison
+, boost
+, brotli
+, bzip2
+, curl
+, editline
+, readline
+, fileset
+, flex
+, git
+, gtest
+, jq
+, doxygen
+, libarchive
+, libcpuid
+, libgit2
+, libseccomp
+, libsodium
+, man
+, lowdown
+, mdbook
+, mdbook-linkcheck
+, mercurial
+, openssh
+, openssl
+, pkg-config
+, rapidcheck
+, sqlite
+, util-linux
+, xz
+
+, busybox-sandbox-shell ? null
+
+# Configuration Options
+#:
+# This probably seems like too many degrees of freedom, but it
+# faithfully reflects how the underlying configure + make build system
+# works. The top-level flake.nix will choose useful combinations of these
+# options to exercise in CI.
+
+, pname ? "nix"
+
+, versionSuffix ? ""
+, officialRelease ? false
+
+# Whether to build Nix. Useful to skip for tasks like (a) just
+# generating API docs or (b) testing existing pre-built versions of Nix
+, doBuild ? true
+
+# Run the unit tests as part of the build. See `installUnitTests` for an
+# alternative to this.
+, doCheck ? __forDefaults.canRunInstalled
+
+# Run the functional tests as part of the build.
+, doInstallCheck ? test-client != null || __forDefaults.canRunInstalled
+
+# Check test coverage of Nix. You probably want to use this with at least
+# one of `doCheck` or `doInstallCheck` enabled.
+, withCoverageChecks ? false
+
+# Whether to build the regular manual
+, enableManual ? __forDefaults.canRunInstalled
+
+# Whether to use garbage collection for the Nix language evaluator.
+#
+# If it is disabled, we just leak memory, but this is not as bad as it
+# sounds so long as evaluation just takes places within short-lived
+# processes. (When the process exits, the memory is reclaimed; it is
+# only leaked *within* the process.)
+, enableGC ? true
+
+# Whether to enable Markdown rendering in the Nix binary.
+, enableMarkdown ? !stdenv.hostPlatform.isWindows
+
+# Which interactive line editor library to use for Nix's repl.
+#
+# Currently supported choices are:
+#
+# - editline (default)
+# - readline
+, readlineFlavor ? if stdenv.hostPlatform.isWindows then "readline" else "editline"
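+#
+# The flavor is consumed below as `{ inherit readline editline; }.${readlineFlavor}`,
+# i.e. it simply selects which of the two packages ends up in `buildInputs`.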
+
+# Whether to build the internal API docs, can be done separately from
+# everything else.
+, enableInternalAPIDocs ? false
+
+# Whether to install unit tests. This is useful when cross compiling
+# since we cannot run them natively during the build, but can do so
+# later.
+, installUnitTests ? doBuild && !__forDefaults.canExecuteHost
+
+# For running the functional tests against a pre-built Nix. You probably
+# want to use this in conjunction with `doBuild = false;`.
+, test-daemon ? null
+, test-client ? null
+
+# Avoid setting things that would interfere with a functioning devShell
+, forDevShell ? false
+
+# Not a real argument, just the only way to approximate let-binding some
+# stuff for argument defaults.
+, __forDefaults ? {
+ canExecuteHost = stdenv.buildPlatform.canExecute stdenv.hostPlatform;
+ canRunInstalled = doBuild && __forDefaults.canExecuteHost;
+ }
+}:
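+
+# A minimal sketch of the `__forDefaults` pattern above, with hypothetical
+# names: argument defaults may refer to each other through this pseudo-argument,
+# which callers are not expected to pass themselves:
+#
+#   { doBuild ? true
+#   , doCheck ? __forDefaults.canRun
+#   , __forDefaults ? { canRun = doBuild; }
+#   }: { inherit doBuild doCheck; }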
+
+let
+ version = lib.fileContents ./.version + versionSuffix;
+
+  # Selected attributes with defaults. These are used to define some
+  # things which should instead be accessed via `finalAttrs` in order to
+  # work with overriding.
+ attrs = {
+ inherit doBuild doCheck doInstallCheck;
+ };
+
+ mkDerivation =
+ if withCoverageChecks
+ then
+ # TODO support `finalAttrs` args function in
+ # `releaseTools.coverageAnalysis`.
+ argsFun:
+ releaseTools.coverageAnalysis (let args = argsFun args; in args)
+ else stdenv.mkDerivation;
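+
+  # The `let args = argsFun args; in args` knot above approximates
+  # `mkDerivation`'s `finalAttrs`-style callback for `coverageAnalysis`,
+  # which only accepts a plain attribute set: the args function is applied
+  # to its own result. A minimal sketch of the same fixed-point trick, with
+  # hypothetical names:
+  #
+  #   fix = f: let x = f x; in x;
+  #   fix (final: { a = 1; b = final.a + 1; })   # => { a = 1; b = 2; }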
+in
+
+mkDerivation (finalAttrs: let
+
+ inherit (finalAttrs)
+ doCheck
+ doInstallCheck
+ ;
+
+ doBuild = !finalAttrs.dontBuild;
+
+ # Either running the unit tests during the build, or installing them
+  # to be run later, requires the unit tests to be built.
+ buildUnitTests = doCheck || installUnitTests;
+
+in {
+ inherit pname version;
+
+ src =
+ let
+ baseFiles = fileset.fileFilter (f: f.name != ".gitignore") ./.;
+ in
+ fileset.toSource {
+ root = ./.;
+ fileset = fileset.intersection baseFiles (fileset.unions ([
+ # For configure
+ ./.version
+ ./configure.ac
+ ./m4
+        # TODO: do we really need README.md? It doesn't seem to be used in the build.
+ ./README.md
+ # For make, regardless of what we are building
+ ./local.mk
+ ./Makefile
+ ./Makefile.config.in
+ ./mk
+ (fileset.fileFilter (f: lib.strings.hasPrefix "nix-profile" f.name) ./scripts)
+ ] ++ lib.optionals doBuild [
+ ./doc
+ ./misc
+ ./precompiled-headers.h
+ ./src
+ ./COPYING
+ ./scripts/local.mk
+ ] ++ lib.optionals buildUnitTests [
+ ./doc/manual
+ ] ++ lib.optionals enableInternalAPIDocs [
+ ./doc/internal-api
+ # Source might not be compiled, but still must be available
+ # for Doxygen to gather comments.
+ ./src
+ ./tests/unit
+ ] ++ lib.optionals buildUnitTests [
+ ./tests/unit
+ ] ++ lib.optionals doInstallCheck [
+ ./tests/functional
+ ]));
+ };
+
+ VERSION_SUFFIX = versionSuffix;
+
+ outputs = [ "out" ]
+ ++ lib.optional doBuild "dev"
+    # If we are doing just the build or just the docs, that one thing will
+    # use "out". We only need additional outputs if we are doing both.
+ ++ lib.optional (doBuild && (enableManual || enableInternalAPIDocs)) "doc"
+ ++ lib.optional installUnitTests "check";
+
+ nativeBuildInputs = [
+ autoconf-archive
+ autoreconfHook
+ pkg-config
+ ] ++ lib.optionals doBuild [
+ bison
+ flex
+ ] ++ lib.optionals enableManual [
+ (lib.getBin lowdown)
+ mdbook
+ mdbook-linkcheck
+ ] ++ lib.optionals doInstallCheck [
+ git
+ mercurial
+ openssh
+ man # for testing `nix-* --help`
+ ] ++ lib.optionals (doInstallCheck || enableManual) [
+ jq # Also for custom mdBook preprocessor.
+ ] ++ lib.optional stdenv.hostPlatform.isLinux util-linux
+ ++ lib.optional enableInternalAPIDocs doxygen
+ ;
+
+ buildInputs = lib.optionals doBuild [
+ boost
+ brotli
+ bzip2
+ curl
+ libarchive
+ libgit2
+ libsodium
+ openssl
+ sqlite
+ xz
+ ({ inherit readline editline; }.${readlineFlavor})
+ ] ++ lib.optionals enableMarkdown [
+ lowdown
+ ] ++ lib.optionals buildUnitTests [
+ gtest
+ rapidcheck
+ ] ++ lib.optional stdenv.isLinux libseccomp
+ ++ lib.optional stdenv.hostPlatform.isx86_64 libcpuid
+ # There have been issues building these dependencies
+ ++ lib.optional (stdenv.hostPlatform == stdenv.buildPlatform && (stdenv.isLinux || stdenv.isDarwin))
+ (aws-sdk-cpp.override {
+ apis = ["s3" "transfer"];
+ customMemoryManagement = false;
+ })
+ ;
+
+ propagatedBuildInputs = [
+ nlohmann_json
+ ] ++ lib.optional enableGC boehmgc;
+
+ dontBuild = !attrs.doBuild;
+ doCheck = attrs.doCheck;
+
+ disallowedReferences = [ boost ];
+
+ preConfigure = lib.optionalString (doBuild && ! stdenv.hostPlatform.isStatic) (
+ ''
+ # Copy libboost_context so we don't get all of Boost in our closure.
+ # https://github.com/NixOS/nixpkgs/issues/45462
+ mkdir -p $out/lib
+ cp -pd ${boost}/lib/{libboost_context*,libboost_thread*,libboost_system*} $out/lib
+ rm -f $out/lib/*.a
+ '' + lib.optionalString stdenv.hostPlatform.isLinux ''
+ chmod u+w $out/lib/*.so.*
+ patchelf --set-rpath $out/lib:${stdenv.cc.cc.lib}/lib $out/lib/libboost_thread.so.*
+ '' + lib.optionalString stdenv.hostPlatform.isDarwin ''
+ for LIB in $out/lib/*.dylib; do
+ chmod u+w $LIB
+ install_name_tool -id $LIB $LIB
+ install_name_tool -delete_rpath ${boost}/lib/ $LIB || true
+ done
+ install_name_tool -change ${boost}/lib/libboost_system.dylib $out/lib/libboost_system.dylib $out/lib/libboost_thread.dylib
+ ''
+ );
+
+ configureFlags = [
+ (lib.enableFeature doBuild "build")
+ (lib.enableFeature buildUnitTests "unit-tests")
+ (lib.enableFeature doInstallCheck "functional-tests")
+ (lib.enableFeature enableInternalAPIDocs "internal-api-docs")
+ (lib.enableFeature enableManual "doc-gen")
+ (lib.enableFeature enableGC "gc")
+ (lib.enableFeature enableMarkdown "markdown")
+ (lib.enableFeature installUnitTests "install-unit-tests")
+ (lib.withFeatureAs true "readline-flavor" readlineFlavor)
+ ] ++ lib.optionals (!forDevShell) [
+ "--sysconfdir=/etc"
+ ] ++ lib.optionals installUnitTests [
+ "--with-check-bin-dir=${builtins.placeholder "check"}/bin"
+ "--with-check-lib-dir=${builtins.placeholder "check"}/lib"
+ ] ++ lib.optionals (doBuild) [
+ "--with-boost=${boost}/lib"
+ ] ++ lib.optionals (doBuild && stdenv.isLinux) [
+ "--with-sandbox-shell=${busybox-sandbox-shell}/bin/busybox"
+ ] ++ lib.optional (doBuild && stdenv.isLinux && !(stdenv.hostPlatform.isStatic && stdenv.system == "aarch64-linux"))
+ "LDFLAGS=-fuse-ld=gold"
+ ++ lib.optional (doBuild && stdenv.hostPlatform.isStatic) "--enable-embedded-sandbox-shell"
+ ;
+
+ enableParallelBuilding = true;
+
+ makeFlags = "profiledir=$(out)/etc/profile.d PRECOMPILE_HEADERS=1";
+
+ installTargets = lib.optional doBuild "install"
+ ++ lib.optional enableInternalAPIDocs "internal-api-html";
+
+ installFlags = "sysconfdir=$(out)/etc";
+
+ # In this case we are probably just running tests, and so there isn't
+  # anything to install; we just make an empty directory to signify that
+  # the tests succeeded.
+ installPhase = if finalAttrs.installTargets != [] then null else ''
+ mkdir -p $out
+ '';
+
+ postInstall = lib.optionalString doBuild (
+ lib.optionalString stdenv.hostPlatform.isStatic ''
+ mkdir -p $out/nix-support
+ echo "file binary-dist $out/bin/nix" >> $out/nix-support/hydra-build-products
+ '' + lib.optionalString stdenv.isDarwin ''
+ install_name_tool \
+ -change ${boost}/lib/libboost_context.dylib \
+ $out/lib/libboost_context.dylib \
+ $out/lib/libnixutil.dylib
+ ''
+ ) + lib.optionalString enableManual ''
+ mkdir -p ''${!outputDoc}/nix-support
+ echo "doc manual ''${!outputDoc}/share/doc/nix/manual" >> ''${!outputDoc}/nix-support/hydra-build-products
+ '' + lib.optionalString enableInternalAPIDocs ''
+ mkdir -p ''${!outputDoc}/nix-support
+ echo "doc internal-api-docs $out/share/doc/nix/internal-api/html" >> ''${!outputDoc}/nix-support/hydra-build-products
+ '';
+
+ doInstallCheck = attrs.doInstallCheck;
+
+ installCheckFlags = "sysconfdir=$(out)/etc";
+ # Work around buggy detection in stdenv.
+ installCheckTarget = "installcheck";
+
+ # Work around weird bug where it doesn't think there is a Makefile.
+ installCheckPhase = if (!doBuild && doInstallCheck) then ''
+ runHook preInstallCheck
+ mkdir -p src/nix-channel
+ make installcheck -j$NIX_BUILD_CORES -l$NIX_BUILD_CORES
+ '' else null;
+
+  # Needed for tests if we are not doing a build, but testing an
+  # existing, pre-built Nix.
+ preInstallCheck =
+ lib.optionalString (! doBuild) ''
+ mkdir -p src/nix-channel
+ ''
+ # See https://github.com/NixOS/nix/issues/2523
+ # Occurs often in tests since https://github.com/NixOS/nix/pull/9900
+ + lib.optionalString stdenv.hostPlatform.isDarwin ''
+ export OBJC_DISABLE_INITIALIZE_FORK_SAFETY=YES
+ '';
+
+ separateDebugInfo = !stdenv.hostPlatform.isStatic;
+
+ # TODO `releaseTools.coverageAnalysis` in Nixpkgs needs to be updated
+ # to work with `strictDeps`.
+ strictDeps = !withCoverageChecks;
+
+ hardeningDisable = lib.optional stdenv.hostPlatform.isStatic "pie";
+
+ meta = {
+ platforms = lib.platforms.unix ++ lib.platforms.windows;
+ mainProgram = "nix";
+ broken = !(lib.all (a: a) [
+ # We cannot run or install unit tests if we don't build them or
+ # Nix proper (which they depend on).
+ (installUnitTests -> doBuild)
+ (doCheck -> doBuild)
+ # The build process for the manual currently requires extracting
+ # data from the Nix executable we are trying to document.
+ (enableManual -> doBuild)
+ ]);
+ };
+
+} // lib.optionalAttrs withCoverageChecks {
+ lcovFilter = [ "*/boost/*" "*-tab.*" ];
+
+ hardeningDisable = ["fortify"];
+
+ NIX_CFLAGS_COMPILE = "-DCOVERAGE=1";
+
+ dontInstall = false;
+} // lib.optionalAttrs (test-daemon != null) {
+ NIX_DAEMON_PACKAGE = test-daemon;
+} // lib.optionalAttrs (test-client != null) {
+ NIX_CLIENT_PACKAGE = test-client;
+})
diff --git a/perl/.yath.rc b/perl/.yath.rc
new file mode 100644
index 000000000..118bf80c8
--- /dev/null
+++ b/perl/.yath.rc
@@ -0,0 +1,2 @@
+[test]
+-I=rel(lib/Nix)
diff --git a/perl/default.nix b/perl/default.nix
index 4687976a1..7103574c9 100644
--- a/perl/default.nix
+++ b/perl/default.nix
@@ -5,12 +5,12 @@
, nix, curl, bzip2, xz, boost, libsodium, darwin
}:
-perl.pkgs.toPerlModule (stdenv.mkDerivation {
+perl.pkgs.toPerlModule (stdenv.mkDerivation (finalAttrs: {
name = "nix-perl-${nix.version}";
src = fileset.toSource {
root = ../.;
- fileset = fileset.unions [
+ fileset = fileset.unions ([
../.version
../m4
../mk
@@ -20,7 +20,10 @@ perl.pkgs.toPerlModule (stdenv.mkDerivation {
./configure.ac
./lib
./local.mk
- ];
+ ] ++ lib.optionals finalAttrs.doCheck [
+ ./.yath.rc
+ ./t
+ ]);
};
nativeBuildInputs =
@@ -40,6 +43,13 @@ perl.pkgs.toPerlModule (stdenv.mkDerivation {
++ lib.optional (stdenv.isLinux || stdenv.isDarwin) libsodium
++ lib.optional stdenv.isDarwin darwin.apple_sdk.frameworks.Security;
+ # `perlPackages.Test2Harness` is marked broken for Darwin
+ doCheck = !stdenv.isDarwin;
+
+ nativeCheckInputs = [
+ perlPackages.Test2Harness
+ ];
+
configureFlags = [
"--with-dbi=${perlPackages.DBI}/${perl.libPrefix}"
"--with-dbd-sqlite=${perlPackages.DBDSQLite}/${perl.libPrefix}"
@@ -48,4 +58,4 @@ perl.pkgs.toPerlModule (stdenv.mkDerivation {
enableParallelBuilding = true;
postUnpack = "sourceRoot=$sourceRoot/perl";
-})
+}))
diff --git a/perl/lib/Nix/Store.pm b/perl/lib/Nix/Store.pm
index 3e4bbee0a..16f2e17c8 100644
--- a/perl/lib/Nix/Store.pm
+++ b/perl/lib/Nix/Store.pm
@@ -12,17 +12,20 @@ our %EXPORT_TAGS = ( 'all' => [ qw( ) ] );
our @EXPORT_OK = ( @{ $EXPORT_TAGS{'all'} } );
our @EXPORT = qw(
- setVerbosity
- isValidPath queryReferences queryPathInfo queryDeriver queryPathHash
- queryPathFromHashPart
- topoSortPaths computeFSClosure followLinksToStorePath exportPaths importPaths
+ StoreWrapper
+ StoreWrapper::new
+ StoreWrapper::isValidPath StoreWrapper::queryReferences StoreWrapper::queryPathInfo StoreWrapper::queryDeriver StoreWrapper::queryPathHash
+ StoreWrapper::queryPathFromHashPart
+ StoreWrapper::topoSortPaths StoreWrapper::computeFSClosure followLinksToStorePath StoreWrapper::exportPaths StoreWrapper::importPaths
+ StoreWrapper::addToStore StoreWrapper::makeFixedOutputPath
+ StoreWrapper::derivationFromPath
+ StoreWrapper::addTempRoot
+ StoreWrapper::queryRawRealisation
+
hashPath hashFile hashString convertHash
signString checkSignature
- addToStore makeFixedOutputPath
- derivationFromPath
- addTempRoot
getBinDir getStoreDir
- queryRawRealisation
+ setVerbosity
);
our $VERSION = '0.15';
diff --git a/perl/lib/Nix/Store.xs b/perl/lib/Nix/Store.xs
index 40257ed74..1c64cc66b 100644
--- a/perl/lib/Nix/Store.xs
+++ b/perl/lib/Nix/Store.xs
@@ -12,52 +12,66 @@
#include "realisation.hh"
#include "globals.hh"
#include "store-api.hh"
-#include "crypto.hh"
+#include "posix-source-accessor.hh"
#include <sodium.h>
#include <nlohmann/json.hpp>
-
using namespace nix;
+static bool libStoreInitialized = false;
-static ref<Store> store()
-{
-    static std::shared_ptr<Store> _store;
- if (!_store) {
- try {
- initLibStore();
- _store = openStore();
- } catch (Error & e) {
- croak("%s", e.what());
- }
- }
-    return ref<Store>(_store);
-}
-
+struct StoreWrapper {
+    ref<Store> store;
+};
MODULE = Nix::Store PACKAGE = Nix::Store
PROTOTYPES: ENABLE
+TYPEMAP: <<EOF
+StoreWrapper *  O_OBJECT
+EOF
+
+StoreWrapper *
+StoreWrapper::new(char * s = nullptr)
+    CODE:
+        static std::shared_ptr<Store> _store;
try {
- RETVAL = store()->isValidPath(store()->parseStorePath(path));
+ if (!libStoreInitialized) {
+ initLibStore();
+ libStoreInitialized = true;
+ }
+ if (items == 1) {
+ _store = openStore();
+ RETVAL = new StoreWrapper {
+ .store = ref{_store}
+ };
+ } else {
+ RETVAL = new StoreWrapper {
+ .store = openStore(s)
+ };
+ }
} catch (Error & e) {
croak("%s", e.what());
}
@@ -65,52 +79,81 @@ int isValidPath(char * path)
RETVAL
-SV * queryReferences(char * path)
+void init()
+ CODE:
+ if (!libStoreInitialized) {
+ initLibStore();
+ libStoreInitialized = true;
+ }
+
+
+void setVerbosity(int level)
+ CODE:
+ verbosity = (Verbosity) level;
+
+
+int
+StoreWrapper::isValidPath(char * path)
+ CODE:
+ try {
+ RETVAL = THIS->store->isValidPath(THIS->store->parseStorePath(path));
+ } catch (Error & e) {
+ croak("%s", e.what());
+ }
+ OUTPUT:
+ RETVAL
+
+
+SV *
+StoreWrapper::queryReferences(char * path)
PPCODE:
try {
- for (auto & i : store()->queryPathInfo(store()->parseStorePath(path))->references)
- XPUSHs(sv_2mortal(newSVpv(store()->printStorePath(i).c_str(), 0)));
+ for (auto & i : THIS->store->queryPathInfo(THIS->store->parseStorePath(path))->references)
+ XPUSHs(sv_2mortal(newSVpv(THIS->store->printStorePath(i).c_str(), 0)));
} catch (Error & e) {
croak("%s", e.what());
}
-SV * queryPathHash(char * path)
+SV *
+StoreWrapper::queryPathHash(char * path)
PPCODE:
try {
- auto s = store()->queryPathInfo(store()->parseStorePath(path))->narHash.to_string(HashFormat::Base32, true);
+ auto s = THIS->store->queryPathInfo(THIS->store->parseStorePath(path))->narHash.to_string(HashFormat::Nix32, true);
XPUSHs(sv_2mortal(newSVpv(s.c_str(), 0)));
} catch (Error & e) {
croak("%s", e.what());
}
-SV * queryDeriver(char * path)
+SV *
+StoreWrapper::queryDeriver(char * path)
PPCODE:
try {
- auto info = store()->queryPathInfo(store()->parseStorePath(path));
+ auto info = THIS->store->queryPathInfo(THIS->store->parseStorePath(path));
if (!info->deriver) XSRETURN_UNDEF;
- XPUSHs(sv_2mortal(newSVpv(store()->printStorePath(*info->deriver).c_str(), 0)));
+ XPUSHs(sv_2mortal(newSVpv(THIS->store->printStorePath(*info->deriver).c_str(), 0)));
} catch (Error & e) {
croak("%s", e.what());
}
-SV * queryPathInfo(char * path, int base32)
+SV *
+StoreWrapper::queryPathInfo(char * path, int base32)
PPCODE:
try {
- auto info = store()->queryPathInfo(store()->parseStorePath(path));
+ auto info = THIS->store->queryPathInfo(THIS->store->parseStorePath(path));
if (!info->deriver)
XPUSHs(&PL_sv_undef);
else
- XPUSHs(sv_2mortal(newSVpv(store()->printStorePath(*info->deriver).c_str(), 0)));
- auto s = info->narHash.to_string(base32 ? HashFormat::Base32 : HashFormat::Base16, true);
+ XPUSHs(sv_2mortal(newSVpv(THIS->store->printStorePath(*info->deriver).c_str(), 0)));
+ auto s = info->narHash.to_string(base32 ? HashFormat::Nix32 : HashFormat::Base16, true);
XPUSHs(sv_2mortal(newSVpv(s.c_str(), 0)));
mXPUSHi(info->registrationTime);
mXPUSHi(info->narSize);
AV * refs = newAV();
for (auto & i : info->references)
- av_push(refs, newSVpv(store()->printStorePath(i).c_str(), 0));
+ av_push(refs, newSVpv(THIS->store->printStorePath(i).c_str(), 0));
XPUSHs(sv_2mortal(newRV((SV *) refs)));
AV * sigs = newAV();
for (auto & i : info->sigs)
@@ -120,10 +163,11 @@ SV * queryPathInfo(char * path, int base32)
croak("%s", e.what());
}
-SV * queryRawRealisation(char * outputId)
+SV *
+StoreWrapper::queryRawRealisation(char * outputId)
PPCODE:
try {
- auto realisation = store()->queryRealisation(DrvOutput::parse(outputId));
+ auto realisation = THIS->store->queryRealisation(DrvOutput::parse(outputId));
if (realisation)
XPUSHs(sv_2mortal(newSVpv(realisation->toJSON().dump().c_str(), 0)));
else
@@ -133,46 +177,50 @@ SV * queryRawRealisation(char * outputId)
}
-SV * queryPathFromHashPart(char * hashPart)
+SV *
+StoreWrapper::queryPathFromHashPart(char * hashPart)
PPCODE:
try {
- auto path = store()->queryPathFromHashPart(hashPart);
- XPUSHs(sv_2mortal(newSVpv(path ? store()->printStorePath(*path).c_str() : "", 0)));
+ auto path = THIS->store->queryPathFromHashPart(hashPart);
+ XPUSHs(sv_2mortal(newSVpv(path ? THIS->store->printStorePath(*path).c_str() : "", 0)));
} catch (Error & e) {
croak("%s", e.what());
}
-SV * computeFSClosure(int flipDirection, int includeOutputs, ...)
+SV *
+StoreWrapper::computeFSClosure(int flipDirection, int includeOutputs, ...)
PPCODE:
try {
StorePathSet paths;
for (int n = 2; n < items; ++n)
- store()->computeFSClosure(store()->parseStorePath(SvPV_nolen(ST(n))), paths, flipDirection, includeOutputs);
+ THIS->store->computeFSClosure(THIS->store->parseStorePath(SvPV_nolen(ST(n))), paths, flipDirection, includeOutputs);
for (auto & i : paths)
- XPUSHs(sv_2mortal(newSVpv(store()->printStorePath(i).c_str(), 0)));
+ XPUSHs(sv_2mortal(newSVpv(THIS->store->printStorePath(i).c_str(), 0)));
} catch (Error & e) {
croak("%s", e.what());
}
-SV * topoSortPaths(...)
+SV *
+StoreWrapper::topoSortPaths(...)
PPCODE:
try {
StorePathSet paths;
- for (int n = 0; n < items; ++n) paths.insert(store()->parseStorePath(SvPV_nolen(ST(n))));
- auto sorted = store()->topoSortPaths(paths);
+ for (int n = 0; n < items; ++n) paths.insert(THIS->store->parseStorePath(SvPV_nolen(ST(n))));
+ auto sorted = THIS->store->topoSortPaths(paths);
for (auto & i : sorted)
- XPUSHs(sv_2mortal(newSVpv(store()->printStorePath(i).c_str(), 0)));
+ XPUSHs(sv_2mortal(newSVpv(THIS->store->printStorePath(i).c_str(), 0)));
} catch (Error & e) {
croak("%s", e.what());
}
-SV * followLinksToStorePath(char * path)
+SV *
+StoreWrapper::followLinksToStorePath(char * path)
CODE:
try {
- RETVAL = newSVpv(store()->printStorePath(store()->followLinksToStorePath(path)).c_str(), 0);
+ RETVAL = newSVpv(THIS->store->printStorePath(THIS->store->followLinksToStorePath(path)).c_str(), 0);
} catch (Error & e) {
croak("%s", e.what());
}
@@ -180,33 +228,39 @@ SV * followLinksToStorePath(char * path)
RETVAL
-void exportPaths(int fd, ...)
+void
+StoreWrapper::exportPaths(int fd, ...)
PPCODE:
try {
StorePathSet paths;
- for (int n = 1; n < items; ++n) paths.insert(store()->parseStorePath(SvPV_nolen(ST(n))));
+ for (int n = 1; n < items; ++n) paths.insert(THIS->store->parseStorePath(SvPV_nolen(ST(n))));
FdSink sink(fd);
- store()->exportPaths(paths, sink);
+ THIS->store->exportPaths(paths, sink);
} catch (Error & e) {
croak("%s", e.what());
}
-void importPaths(int fd, int dontCheckSigs)
+void
+StoreWrapper::importPaths(int fd, int dontCheckSigs)
PPCODE:
try {
FdSource source(fd);
- store()->importPaths(source, dontCheckSigs ? NoCheckSigs : CheckSigs);
+ THIS->store->importPaths(source, dontCheckSigs ? NoCheckSigs : CheckSigs);
} catch (Error & e) {
croak("%s", e.what());
}
-SV * hashPath(char * algo, int base32, char * path)
+SV *
+hashPath(char * algo, int base32, char * path)
PPCODE:
try {
- Hash h = hashPath(parseHashType(algo), path).first;
- auto s = h.to_string(base32 ? HashFormat::Base32 : HashFormat::Base16, false);
+ auto [accessor, canonPath] = PosixSourceAccessor::createAtRoot(path);
+ Hash h = hashPath(
+ accessor, canonPath,
+ FileIngestionMethod::Recursive, parseHashAlgo(algo));
+ auto s = h.to_string(base32 ? HashFormat::Nix32 : HashFormat::Base16, false);
XPUSHs(sv_2mortal(newSVpv(s.c_str(), 0)));
} catch (Error & e) {
croak("%s", e.what());
@@ -216,8 +270,8 @@ SV * hashPath(char * algo, int base32, char * path)
SV * hashFile(char * algo, int base32, char * path)
PPCODE:
try {
- Hash h = hashFile(parseHashType(algo), path);
- auto s = h.to_string(base32 ? HashFormat::Base32 : HashFormat::Base16, false);
+ Hash h = hashFile(parseHashAlgo(algo), path);
+ auto s = h.to_string(base32 ? HashFormat::Nix32 : HashFormat::Base16, false);
XPUSHs(sv_2mortal(newSVpv(s.c_str(), 0)));
} catch (Error & e) {
croak("%s", e.what());
@@ -227,8 +281,8 @@ SV * hashFile(char * algo, int base32, char * path)
SV * hashString(char * algo, int base32, char * s)
PPCODE:
try {
- Hash h = hashString(parseHashType(algo), s);
- auto s = h.to_string(base32 ? HashFormat::Base32 : HashFormat::Base16, false);
+ Hash h = hashString(parseHashAlgo(algo), s);
+ auto s = h.to_string(base32 ? HashFormat::Nix32 : HashFormat::Base16, false);
XPUSHs(sv_2mortal(newSVpv(s.c_str(), 0)));
} catch (Error & e) {
croak("%s", e.what());
@@ -238,8 +292,8 @@ SV * hashString(char * algo, int base32, char * s)
SV * convertHash(char * algo, char * s, int toBase32)
PPCODE:
try {
- auto h = Hash::parseAny(s, parseHashType(algo));
- auto s = h.to_string(toBase32 ? HashFormat::Base32 : HashFormat::Base16, false);
+ auto h = Hash::parseAny(s, parseHashAlgo(algo));
+ auto s = h.to_string(toBase32 ? HashFormat::Nix32 : HashFormat::Base16, false);
XPUSHs(sv_2mortal(newSVpv(s.c_str(), 0)));
} catch (Error & e) {
croak("%s", e.what());
@@ -277,60 +331,67 @@ int checkSignature(SV * publicKey_, SV * sig_, char * msg)
RETVAL
-SV * addToStore(char * srcPath, int recursive, char * algo)
+SV *
+StoreWrapper::addToStore(char * srcPath, int recursive, char * algo)
PPCODE:
try {
auto method = recursive ? FileIngestionMethod::Recursive : FileIngestionMethod::Flat;
- auto path = store()->addToStore(std::string(baseNameOf(srcPath)), srcPath, method, parseHashType(algo));
- XPUSHs(sv_2mortal(newSVpv(store()->printStorePath(path).c_str(), 0)));
+ auto [accessor, canonPath] = PosixSourceAccessor::createAtRoot(srcPath);
+ auto path = THIS->store->addToStore(
+ std::string(baseNameOf(srcPath)),
+ accessor, canonPath,
+ method, parseHashAlgo(algo));
+ XPUSHs(sv_2mortal(newSVpv(THIS->store->printStorePath(path).c_str(), 0)));
} catch (Error & e) {
croak("%s", e.what());
}
-SV * makeFixedOutputPath(int recursive, char * algo, char * hash, char * name)
+SV *
+StoreWrapper::makeFixedOutputPath(int recursive, char * algo, char * hash, char * name)
PPCODE:
try {
- auto h = Hash::parseAny(hash, parseHashType(algo));
+ auto h = Hash::parseAny(hash, parseHashAlgo(algo));
auto method = recursive ? FileIngestionMethod::Recursive : FileIngestionMethod::Flat;
- auto path = store()->makeFixedOutputPath(name, FixedOutputInfo {
+ auto path = THIS->store->makeFixedOutputPath(name, FixedOutputInfo {
.method = method,
.hash = h,
.references = {},
});
- XPUSHs(sv_2mortal(newSVpv(store()->printStorePath(path).c_str(), 0)));
+ XPUSHs(sv_2mortal(newSVpv(THIS->store->printStorePath(path).c_str(), 0)));
} catch (Error & e) {
croak("%s", e.what());
}
-SV * derivationFromPath(char * drvPath)
+SV *
+StoreWrapper::derivationFromPath(char * drvPath)
PREINIT:
HV *hash;
CODE:
try {
- Derivation drv = store()->derivationFromPath(store()->parseStorePath(drvPath));
+ Derivation drv = THIS->store->derivationFromPath(THIS->store->parseStorePath(drvPath));
hash = newHV();
HV * outputs = newHV();
- for (auto & i : drv.outputsAndOptPaths(*store())) {
+ for (auto & i : drv.outputsAndOptPaths(*THIS->store)) {
hv_store(
outputs, i.first.c_str(), i.first.size(),
!i.second.second
? newSV(0) /* null value */
- : newSVpv(store()->printStorePath(*i.second.second).c_str(), 0),
+ : newSVpv(THIS->store->printStorePath(*i.second.second).c_str(), 0),
0);
}
hv_stores(hash, "outputs", newRV((SV *) outputs));
AV * inputDrvs = newAV();
for (auto & i : drv.inputDrvs.map)
- av_push(inputDrvs, newSVpv(store()->printStorePath(i.first).c_str(), 0)); // !!! ignores i->second
+ av_push(inputDrvs, newSVpv(THIS->store->printStorePath(i.first).c_str(), 0)); // !!! ignores i->second
hv_stores(hash, "inputDrvs", newRV((SV *) inputDrvs));
AV * inputSrcs = newAV();
for (auto & i : drv.inputSrcs)
- av_push(inputSrcs, newSVpv(store()->printStorePath(i).c_str(), 0));
+ av_push(inputSrcs, newSVpv(THIS->store->printStorePath(i).c_str(), 0));
hv_stores(hash, "inputSrcs", newRV((SV *) inputSrcs));
hv_stores(hash, "platform", newSVpv(drv.platform.c_str(), 0));
@@ -354,10 +415,11 @@ SV * derivationFromPath(char * drvPath)
RETVAL
-void addTempRoot(char * storePath)
+void
+StoreWrapper::addTempRoot(char * storePath)
PPCODE:
try {
- store()->addTempRoot(store()->parseStorePath(storePath));
+ THIS->store->addTempRoot(THIS->store->parseStorePath(storePath));
} catch (Error & e) {
croak("%s", e.what());
}
diff --git a/perl/local.mk b/perl/local.mk
index 0eae651d8..ed4764eb9 100644
--- a/perl/local.mk
+++ b/perl/local.mk
@@ -41,3 +41,6 @@ Store_FORCE_INSTALL = 1
Store_INSTALL_DIR = $(perllibdir)/auto/Nix/Store
clean-files += lib/Nix/Config.pm lib/Nix/Store.cc Makefile.config
+
+check: all
+ yath test
diff --git a/perl/t/init.t b/perl/t/init.t
new file mode 100644
index 000000000..80197e013
--- /dev/null
+++ b/perl/t/init.t
@@ -0,0 +1,13 @@
+use strict;
+use warnings;
+use Test2::V0;
+
+use Nix::Store;
+
+my $s = new Nix::Store("dummy://");
+
+my $res = $s->isValidPath("/nix/store/g1w7hy3qg1w7hy3qg1w7hy3qg1w7hy3q-bar");
+
+ok(!$res, "should not have path");
+
+done_testing;
diff --git a/scripts/binary-tarball.nix b/scripts/binary-tarball.nix
new file mode 100644
index 000000000..104189b0c
--- /dev/null
+++ b/scripts/binary-tarball.nix
@@ -0,0 +1,84 @@
+{ runCommand
+, system
+, buildPackages
+, cacert
+, nix
+}:
+
+let
+
+ installerClosureInfo = buildPackages.closureInfo {
+ rootPaths = [ nix cacert ];
+ };
+
+ inherit (nix) version;
+
+ env = {
+ #nativeBuildInputs = lib.optional (system != "aarch64-linux") shellcheck;
+ meta.description = "Distribution-independent Nix bootstrap binaries for ${system}";
+ };
+
+in
+
+runCommand "nix-binary-tarball-${version}" env ''
+ cp ${installerClosureInfo}/registration $TMPDIR/reginfo
+ cp ${./create-darwin-volume.sh} $TMPDIR/create-darwin-volume.sh
+ substitute ${./install-nix-from-closure.sh} $TMPDIR/install \
+ --subst-var-by nix ${nix} \
+ --subst-var-by cacert ${cacert}
+
+ substitute ${./install-darwin-multi-user.sh} $TMPDIR/install-darwin-multi-user.sh \
+ --subst-var-by nix ${nix} \
+ --subst-var-by cacert ${cacert}
+ substitute ${./install-systemd-multi-user.sh} $TMPDIR/install-systemd-multi-user.sh \
+ --subst-var-by nix ${nix} \
+ --subst-var-by cacert ${cacert}
+ substitute ${./install-multi-user.sh} $TMPDIR/install-multi-user \
+ --subst-var-by nix ${nix} \
+ --subst-var-by cacert ${cacert}
+
+ if type -p shellcheck; then
+ # SC1090: Don't worry about not being able to find
+ # $nix/etc/profile.d/nix.sh
+ shellcheck --exclude SC1090 $TMPDIR/install
+ shellcheck $TMPDIR/create-darwin-volume.sh
+ shellcheck $TMPDIR/install-darwin-multi-user.sh
+ shellcheck $TMPDIR/install-systemd-multi-user.sh
+
+ # SC1091: Don't panic about not being able to source
+ # /etc/profile
+ # SC2002: Ignore "useless cat" "error", when loading
+ # .reginfo, as the cat is a much cleaner
+ # implementation, even though it is "useless"
+ # SC2116: Allow ROOT_HOME=$(echo ~root) for resolving
+ # root's home directory
+ shellcheck --external-sources \
+ --exclude SC1091,SC2002,SC2116 $TMPDIR/install-multi-user
+ fi
+
+ chmod +x $TMPDIR/install
+ chmod +x $TMPDIR/create-darwin-volume.sh
+ chmod +x $TMPDIR/install-darwin-multi-user.sh
+ chmod +x $TMPDIR/install-systemd-multi-user.sh
+ chmod +x $TMPDIR/install-multi-user
+ dir=nix-${version}-${system}
+ fn=$out/$dir.tar.xz
+ mkdir -p $out/nix-support
+ echo "file binary-dist $fn" >> $out/nix-support/hydra-build-products
+ tar cvfJ $fn \
+ --owner=0 --group=0 --mode=u+rw,uga+r \
+ --mtime='1970-01-01' \
+ --absolute-names \
+ --hard-dereference \
+ --transform "s,$TMPDIR/install,$dir/install," \
+ --transform "s,$TMPDIR/create-darwin-volume.sh,$dir/create-darwin-volume.sh," \
+ --transform "s,$TMPDIR/reginfo,$dir/.reginfo," \
+ --transform "s,$NIX_STORE,$dir/store,S" \
+ $TMPDIR/install \
+ $TMPDIR/create-darwin-volume.sh \
+ $TMPDIR/install-darwin-multi-user.sh \
+ $TMPDIR/install-systemd-multi-user.sh \
+ $TMPDIR/install-multi-user \
+ $TMPDIR/reginfo \
+ $(cat ${installerClosureInfo}/store-paths)
+''
diff --git a/scripts/install-darwin-multi-user.sh b/scripts/install-darwin-multi-user.sh
index 0326d3415..24c9052f9 100644
--- a/scripts/install-darwin-multi-user.sh
+++ b/scripts/install-darwin-multi-user.sh
@@ -3,11 +3,13 @@
set -eu
set -o pipefail
+# System specific settings
+export NIX_FIRST_BUILD_UID="${NIX_FIRST_BUILD_UID:-301}"
+export NIX_BUILD_USER_NAME_TEMPLATE="_nixbld%d"
+
readonly NIX_DAEMON_DEST=/Library/LaunchDaemons/org.nixos.nix-daemon.plist
# create by default; set 0 to DIY, use a symlink, etc.
readonly NIX_VOLUME_CREATE=${NIX_VOLUME_CREATE:-1} # now default
-NIX_FIRST_BUILD_UID="301"
-NIX_BUILD_USER_NAME_TEMPLATE="_nixbld%d"
# caution: may update times on / if not run as normal non-root user
read_only_root() {
@@ -100,7 +102,7 @@ poly_extra_try_me_commands() {
poly_configure_nix_daemon_service() {
task "Setting up the nix-daemon LaunchDaemon"
_sudo "to set up the nix-daemon as a LaunchDaemon" \
- /usr/bin/install -m -rw-r--r-- "/nix/var/nix/profiles/default$NIX_DAEMON_DEST" "$NIX_DAEMON_DEST"
+ /usr/bin/install -m "u=rw,go=r" "/nix/var/nix/profiles/default$NIX_DAEMON_DEST" "$NIX_DAEMON_DEST"
_sudo "to load the LaunchDaemon plist for nix-daemon" \
launchctl load /Library/LaunchDaemons/org.nixos.nix-daemon.plist
diff --git a/scripts/install-multi-user.sh b/scripts/install-multi-user.sh
index a08f62333..ad3ee8881 100644
--- a/scripts/install-multi-user.sh
+++ b/scripts/install-multi-user.sh
@@ -25,9 +25,9 @@ readonly RED='\033[31m'
readonly NIX_USER_COUNT=${NIX_USER_COUNT:-32}
readonly NIX_BUILD_GROUP_ID="${NIX_BUILD_GROUP_ID:-30000}"
readonly NIX_BUILD_GROUP_NAME="nixbld"
-# darwin installer needs to override these
-NIX_FIRST_BUILD_UID="${NIX_FIRST_BUILD_UID:-30001}"
-NIX_BUILD_USER_NAME_TEMPLATE="nixbld%d"
+# each system specific installer must set these:
+# NIX_FIRST_BUILD_UID
+# NIX_BUILD_USER_NAME_TEMPLATE
# Please don't change this. We don't support it, because the
# default shell profile that comes with Nix doesn't support it.
readonly NIX_ROOT="/nix"
@@ -707,6 +707,12 @@ EOF
fi
}
+check_required_system_specific_settings() {
+ if [ -z "${NIX_FIRST_BUILD_UID+x}" ] || [ -z "${NIX_BUILD_USER_NAME_TEMPLATE+x}" ]; then
+ failure "Internal error: System specific installer for $(uname) ($1) does not export required settings."
+ fi
+}
+
welcome_to_nix() {
local -r NIX_UID_RANGES="${NIX_FIRST_BUILD_UID}..$((NIX_FIRST_BUILD_UID + NIX_USER_COUNT - 1))"
local -r RANGE_TEXT=$(echo -ne "${BLUE}(uids [${NIX_UID_RANGES}])${ESC}")
@@ -726,7 +732,9 @@ manager. This will happen in a few stages:
if you are ready to continue.
3. Create the system users ${RANGE_TEXT} and groups ${GROUP_TEXT}
- that the Nix daemon uses to run builds.
+ that the Nix daemon uses to run builds. To create system users
+ in a different range, exit and run this tool again with
+ NIX_FIRST_BUILD_UID set.
4. Perform the basic installation of the Nix files daemon.
@@ -968,13 +976,16 @@ main() {
if is_os_darwin; then
# shellcheck source=./install-darwin-multi-user.sh
. "$EXTRACTED_NIX_PATH/install-darwin-multi-user.sh"
+ check_required_system_specific_settings "install-darwin-multi-user.sh"
elif is_os_linux; then
# shellcheck source=./install-systemd-multi-user.sh
. "$EXTRACTED_NIX_PATH/install-systemd-multi-user.sh" # most of this works on non-systemd distros also
+ check_required_system_specific_settings "install-systemd-multi-user.sh"
else
failure "Sorry, I don't know what to do on $(uname)"
fi
+
welcome_to_nix
if ! is_root; then
diff --git a/scripts/install-systemd-multi-user.sh b/scripts/install-systemd-multi-user.sh
index 07b34033a..202a9bb54 100755
--- a/scripts/install-systemd-multi-user.sh
+++ b/scripts/install-systemd-multi-user.sh
@@ -3,6 +3,10 @@
set -eu
set -o pipefail
+# System specific settings
+export NIX_FIRST_BUILD_UID="${NIX_FIRST_BUILD_UID:-30001}"
+export NIX_BUILD_USER_NAME_TEMPLATE="nixbld%d"
+
readonly SERVICE_SRC=/lib/systemd/system/nix-daemon.service
readonly SERVICE_DEST=/etc/systemd/system/nix-daemon.service
diff --git a/scripts/installer.nix b/scripts/installer.nix
new file mode 100644
index 000000000..cc7759c2c
--- /dev/null
+++ b/scripts/installer.nix
@@ -0,0 +1,36 @@
+{ lib
+, runCommand
+, nix
+, tarballs
+}:
+
+runCommand "installer-script" {
+ buildInputs = [ nix ];
+} ''
+ mkdir -p $out/nix-support
+
+ # Converts /nix/store/50p3qk8k...-nix-2.4pre20201102_550e11f/bin/nix to 50p3qk8k.../bin/nix.
+ tarballPath() {
+ # Remove the store prefix
+ local path=''${1#${builtins.storeDir}/}
+ # Get the path relative to the derivation root
+ local rest=''${path#*/}
+ # Get the derivation hash
+ local drvHash=''${path%%-*}
+ echo "$drvHash/$rest"
+ }
+
+ substitute ${./install.in} $out/install \
+ ${lib.concatMapStrings
+ (tarball: let
+ inherit (tarball.stdenv.hostPlatform) system;
+ in '' \
+ --replace '@tarballHash_${system}@' $(nix --experimental-features nix-command hash-file --base16 --type sha256 ${tarball}/*.tar.xz) \
+ --replace '@tarballPath_${system}@' $(tarballPath ${tarball}/*.tar.xz) \
+ ''
+ )
+ tarballs
+ } --replace '@nixVersion@' ${nix.version}
+
+ echo "file installer $out/install" >> $out/nix-support/hydra-build-products
+''
diff --git a/scripts/nix-profile-daemon.fish.in b/scripts/nix-profile-daemon.fish.in
index c23aa64f0..346dce5dd 100644
--- a/scripts/nix-profile-daemon.fish.in
+++ b/scripts/nix-profile-daemon.fish.in
@@ -28,7 +28,7 @@ else
end
# Set $NIX_SSL_CERT_FILE so that Nixpkgs applications like curl work.
-if test -n "$NIX_SSH_CERT_FILE"
+if test -n "$NIX_SSL_CERT_FILE"
: # Allow users to override the NIX_SSL_CERT_FILE
else if test -e /etc/ssl/certs/ca-certificates.crt # NixOS, Ubuntu, Debian, Gentoo, Arch
set --export NIX_SSL_CERT_FILE /etc/ssl/certs/ca-certificates.crt
@@ -44,7 +44,7 @@ else if test -e "$NIX_LINK/etc/ca-bundle.crt" # old cacert in Nix profile
set --export NIX_SSL_CERT_FILE "$NIX_LINK/etc/ca-bundle.crt"
else
# Fall back to what is in the nix profiles, favouring whatever is defined last.
- for i in $NIX_PROFILES
+ for i in (string split ' ' $NIX_PROFILES)
if test -e "$i/etc/ssl/certs/ca-bundle.crt"
set --export NIX_SSL_CERT_FILE "$i/etc/ssl/certs/ca-bundle.crt"
end
diff --git a/src/build-remote/build-remote.cc b/src/build-remote/build-remote.cc
index d69d3a0c2..118468477 100644
--- a/src/build-remote/build-remote.cc
+++ b/src/build-remote/build-remote.cc
@@ -137,11 +137,8 @@ static int main_build_remote(int argc, char * * argv)
for (auto & m : machines) {
debug("considering building on remote machine '%s'", m.storeUri);
- if (m.enabled
- && (neededSystem == "builtin"
- || std::find(m.systemTypes.begin(),
- m.systemTypes.end(),
- neededSystem) != m.systemTypes.end()) &&
+ if (m.enabled &&
+ m.systemSupported(neededSystem) &&
m.allSupported(requiredFeatures) &&
m.mandatoryMet(requiredFeatures))
{
@@ -205,7 +202,7 @@ static int main_build_remote(int argc, char * * argv)
else
drvstr = "";
- auto error = hintformat(errorText);
+ auto error = HintFmt(errorText);
error
% drvstr
% neededSystem
@@ -214,7 +211,7 @@ static int main_build_remote(int argc, char * * argv)
for (auto & m : machines)
error
- % concatStringsSep<std::vector<std::string>>(", ", m.systemTypes)
+ % concatStringsSep(", ", m.systemTypes)
% m.maxJobs
% concatStringsSep(", ", m.supportedFeatures)
% concatStringsSep(", ", m.mandatoryFeatures);
diff --git a/src/libcmd/built-path.cc b/src/libcmd/built-path.cc
index 8e2efc7c3..c5eb93c5d 100644
--- a/src/libcmd/built-path.cc
+++ b/src/libcmd/built-path.cc
@@ -12,9 +12,9 @@ namespace nix {
bool MY_TYPE ::operator COMPARATOR (const MY_TYPE & other) const \
{ \
const MY_TYPE* me = this; \
- auto fields1 = std::make_tuple(*me->drvPath, me->FIELD); \
+ auto fields1 = std::tie(*me->drvPath, me->FIELD); \
me = &other; \
- auto fields2 = std::make_tuple(*me->drvPath, me->FIELD); \
+ auto fields2 = std::tie(*me->drvPath, me->FIELD); \
return fields1 COMPARATOR fields2; \
}
#define CMP(CHILD_TYPE, MY_TYPE, FIELD) \
diff --git a/src/libcmd/command.cc b/src/libcmd/command.cc
index de9f546fc..369fa6004 100644
--- a/src/libcmd/command.cc
+++ b/src/libcmd/command.cc
@@ -1,4 +1,5 @@
#include "command.hh"
+#include "markdown.hh"
#include "store-api.hh"
#include "local-fs-store.hh"
#include "derivations.hh"
@@ -34,6 +35,19 @@ nlohmann::json NixMultiCommand::toJSON()
return MultiCommand::toJSON();
}
+void NixMultiCommand::run()
+{
+ if (!command) {
+ std::set<std::string> subCommandTextLines;
+ for (auto & [name, _] : commands)
+ subCommandTextLines.insert(fmt("- `%s`", name));
+ std::string markdownError = fmt("`nix %s` requires a sub-command. Available sub-commands:\n\n%s\n",
+ commandName, concatStringsSep("\n", subCommandTextLines));
+ throw UsageError(renderMarkdownToTerminal(markdownError));
+ }
+ command->second->run();
+}
+
StoreCommand::StoreCommand()
{
}
diff --git a/src/libcmd/command.hh b/src/libcmd/command.hh
index 2001f3a98..37b3ca4a7 100644
--- a/src/libcmd/command.hh
+++ b/src/libcmd/command.hh
@@ -27,9 +27,13 @@ static constexpr Command::Category catNixInstallation = 102;
static constexpr auto installablesCategory = "Options that change the interpretation of [installables](@docroot@/command-ref/new-cli/nix.md#installables)";
-struct NixMultiCommand : virtual MultiCommand, virtual Command
+struct NixMultiCommand : MultiCommand, virtual Command
{
nlohmann::json toJSON() override;
+
+ using MultiCommand::MultiCommand;
+
+ virtual void run() override;
};
// For the overloaded run methods
diff --git a/src/libcmd/common-eval-args.cc b/src/libcmd/common-eval-args.cc
index 193972272..b87bbbc27 100644
--- a/src/libcmd/common-eval-args.cc
+++ b/src/libcmd/common-eval-args.cc
@@ -9,6 +9,7 @@
#include "store-api.hh"
#include "command.hh"
#include "tarball.hh"
+#include "fetch-to-store.hh"
namespace nix {
@@ -19,7 +20,7 @@ MixEvalArgs::MixEvalArgs()
.description = "Pass the value *expr* as the argument *name* to Nix functions.",
.category = category,
.labels = {"name", "expr"},
- .handler = {[&](std::string name, std::string expr) { autoArgs[name] = 'E' + expr; }}
+ .handler = {[&](std::string name, std::string expr) { autoArgs.insert_or_assign(name, AutoArg{AutoArgExpr(expr)}); }}
});
addFlag({
@@ -27,7 +28,24 @@ MixEvalArgs::MixEvalArgs()
.description = "Pass the string *string* as the argument *name* to Nix functions.",
.category = category,
.labels = {"name", "string"},
- .handler = {[&](std::string name, std::string s) { autoArgs[name] = 'S' + s; }},
+ .handler = {[&](std::string name, std::string s) { autoArgs.insert_or_assign(name, AutoArg{AutoArgString(s)}); }},
+ });
+
+ addFlag({
+ .longName = "arg-from-file",
+ .description = "Pass the contents of file *path* as the argument *name* to Nix functions.",
+ .category = category,
+ .labels = {"name", "path"},
+ .handler = {[&](std::string name, std::string path) { autoArgs.insert_or_assign(name, AutoArg{AutoArgFile(path)}); }},
+ .completer = completePath
+ });
+
+ addFlag({
+ .longName = "arg-from-stdin",
+ .description = "Pass the contents of stdin as the argument *name* to Nix functions.",
+ .category = category,
+ .labels = {"name"},
+ .handler = {[&](std::string name) { autoArgs.insert_or_assign(name, AutoArg{AutoArgStdin{}}); }},
});
addFlag({
@@ -153,22 +171,33 @@ MixEvalArgs::MixEvalArgs()
Bindings * MixEvalArgs::getAutoArgs(EvalState & state)
{
auto res = state.buildBindings(autoArgs.size());
- for (auto & i : autoArgs) {
+ for (auto & [name, arg] : autoArgs) {
auto v = state.allocValue();
- if (i.second[0] == 'E')
- state.mkThunk_(*v, state.parseExprFromString(i.second.substr(1), state.rootPath(CanonPath::fromCwd())));
- else
- v->mkString(((std::string_view) i.second).substr(1));
- res.insert(state.symbols.create(i.first), v);
+ std::visit(overloaded {
+ [&](const AutoArgExpr & arg) {
+ state.mkThunk_(*v, state.parseExprFromString(arg.expr, state.rootPath(".")));
+ },
+ [&](const AutoArgString & arg) {
+ v->mkString(arg.s);
+ },
+ [&](const AutoArgFile & arg) {
+ v->mkString(readFile(arg.path));
+ },
+ [&](const AutoArgStdin & arg) {
+ v->mkString(readFile(STDIN_FILENO));
+ }
+ }, arg);
+ res.insert(state.symbols.create(name), v);
}
return res.finish();
}
-SourcePath lookupFileArg(EvalState & state, std::string_view s, CanonPath baseDir)
+SourcePath lookupFileArg(EvalState & state, std::string_view s, const Path * baseDir)
{
if (EvalSettings::isPseudoUrl(s)) {
- auto storePath = fetchers::downloadTarball(
- state.store, EvalSettings::resolvePseudoUrl(s), "source", false).storePath;
+ auto accessor = fetchers::downloadTarball(
+ EvalSettings::resolvePseudoUrl(s)).accessor;
+ auto storePath = fetchToStore(*state.store, SourcePath(accessor), FetchMode::Copy);
return state.rootPath(CanonPath(state.store->toRealPath(storePath)));
}
@@ -185,7 +214,7 @@ SourcePath lookupFileArg(EvalState & state, std::string_view s, CanonPath baseDi
}
else
- return state.rootPath(CanonPath(s, baseDir));
+ return state.rootPath(baseDir ? absPath(s, *baseDir) : absPath(s));
}
}
diff --git a/src/libcmd/common-eval-args.hh b/src/libcmd/common-eval-args.hh
index 4b403d936..25ce5b9da 100644
--- a/src/libcmd/common-eval-args.hh
+++ b/src/libcmd/common-eval-args.hh
@@ -6,6 +6,8 @@
#include "common-args.hh"
#include "search-path.hh"
+#include <filesystem>
+
namespace nix {
class Store;
@@ -26,9 +28,16 @@ struct MixEvalArgs : virtual Args, virtual MixRepair
std::optional<std::string> evalStoreUrl;
private:
- std::map<std::string, std::string> autoArgs;
+ struct AutoArgExpr { std::string expr; };
+ struct AutoArgString { std::string s; };
+ struct AutoArgFile { std::filesystem::path path; };
+ struct AutoArgStdin { };
+
+ using AutoArg = std::variant<AutoArgExpr, AutoArgString, AutoArgFile, AutoArgStdin>;
+
+ std::map<std::string, AutoArg> autoArgs;
};
-SourcePath lookupFileArg(EvalState & state, std::string_view s, CanonPath baseDir = CanonPath::fromCwd());
+SourcePath lookupFileArg(EvalState & state, std::string_view s, const Path * baseDir = nullptr);
}
diff --git a/src/libcmd/editor-for.cc b/src/libcmd/editor-for.cc
index 619d3673f..6bf36bd64 100644
--- a/src/libcmd/editor-for.cc
+++ b/src/libcmd/editor-for.cc
@@ -1,5 +1,6 @@
#include "editor-for.hh"
#include "environment-variables.hh"
+#include "source-path.hh"
namespace nix {
@@ -16,7 +17,7 @@ Strings editorFor(const SourcePath & file, uint32_t line)
editor.find("vim") != std::string::npos ||
editor.find("kak") != std::string::npos))
args.push_back(fmt("+%d", line));
- args.push_back(path->abs());
+ args.push_back(path->string());
return args;
}
diff --git a/src/libcmd/editor-for.hh b/src/libcmd/editor-for.hh
index fbf4307c9..8acd7011e 100644
--- a/src/libcmd/editor-for.hh
+++ b/src/libcmd/editor-for.hh
@@ -2,7 +2,7 @@
///@file
#include "types.hh"
-#include "input-accessor.hh"
+#include "source-path.hh"
namespace nix {
diff --git a/src/libcmd/installable-attr-path.cc b/src/libcmd/installable-attr-path.cc
index 06e507872..3ec1c1614 100644
--- a/src/libcmd/installable-attr-path.cc
+++ b/src/libcmd/installable-attr-path.cc
@@ -58,22 +58,22 @@ DerivedPathsWithInfo InstallableAttrPath::toDerivedPaths()
Bindings & autoArgs = *cmd.getAutoArgs(*state);
- DrvInfos drvInfos;
- getDerivations(*state, *v, "", autoArgs, drvInfos, false);
+ PackageInfos packageInfos;
+ getDerivations(*state, *v, "", autoArgs, packageInfos, false);
// Backward compatibility hack: group results by drvPath. This
// helps keep .all output together.
std::map<StorePath, OutputsSpec> byDrvPath;
- for (auto & drvInfo : drvInfos) {
- auto drvPath = drvInfo.queryDrvPath();
+ for (auto & packageInfo : packageInfos) {
+ auto drvPath = packageInfo.queryDrvPath();
if (!drvPath)
throw Error("'%s' is not a derivation", what());
auto newOutputs = std::visit(overloaded {
[&](const ExtendedOutputsSpec::Default & d) -> OutputsSpec {
- std::set<std::string> outputsToInstall;
- for (auto & output : drvInfo.queryOutputs(false, true))
+ for (auto & output : packageInfo.queryOutputs(false, true))
outputsToInstall.insert(output.first);
return OutputsSpec::Names { std::move(outputsToInstall) };
},
diff --git a/src/libcmd/installable-flake.cc b/src/libcmd/installable-flake.cc
index 2f428cb7e..ddec7537b 100644
--- a/src/libcmd/installable-flake.cc
+++ b/src/libcmd/installable-flake.cc
@@ -52,7 +52,7 @@ Value * InstallableFlake::getFlakeOutputs(EvalState & state, const flake::Locked
auto aOutputs = vFlake->attrs->get(state.symbols.create("outputs"));
assert(aOutputs);
- state.forceValue(*aOutputs->value, [&]() { return aOutputs->value->determinePos(noPos); });
+ state.forceValue(*aOutputs->value, aOutputs->value->determinePos(noPos));
return aOutputs->value;
}
diff --git a/src/libcmd/installable-value.cc b/src/libcmd/installable-value.cc
index 08ad35105..1aa2e65c1 100644
--- a/src/libcmd/installable-value.cc
+++ b/src/libcmd/installable-value.cc
@@ -1,5 +1,6 @@
#include "installable-value.hh"
#include "eval-cache.hh"
+#include "fetch-to-store.hh"
namespace nix {
@@ -44,7 +45,7 @@ ref InstallableValue::require(ref installable)
std::optional InstallableValue::trySinglePathToDerivedPaths(Value & v, const PosIdx pos, std::string_view errorCtx)
{
if (v.type() == nPath) {
- auto storePath = v.path().fetchToStore(state->store);
+ auto storePath = fetchToStore(*state->store, v.path(), FetchMode::Copy);
return {{
.path = DerivedPath::Opaque {
.path = std::move(storePath),
diff --git a/src/libcmd/installable-value.hh b/src/libcmd/installable-value.hh
index 3138ce8ec..f300d392b 100644
--- a/src/libcmd/installable-value.hh
+++ b/src/libcmd/installable-value.hh
@@ -6,7 +6,7 @@
namespace nix {
-struct DrvInfo;
+struct PackageInfo;
struct SourceExprCommand;
namespace eval_cache { class EvalCache; class AttrCursor; }
diff --git a/src/libcmd/installables.cc b/src/libcmd/installables.cc
index c016ad039..95abc46cf 100644
--- a/src/libcmd/installables.cc
+++ b/src/libcmd/installables.cc
@@ -21,6 +21,7 @@
#include "url.hh"
#include "registry.hh"
#include "build-result.hh"
+#include "fs-input-accessor.hh"
#include
#include
@@ -150,7 +151,7 @@ MixFlakeOptions::MixFlakeOptions()
.category = category,
.labels = {"flake-lock-path"},
.handler = {[&](std::string lockFilePath) {
- lockFlags.referenceLockFilePath = lockFilePath;
+ lockFlags.referenceLockFilePath = getUnfilteredRootPath(CanonPath(absPath(lockFilePath)));
}},
.completer = completePath
});
@@ -325,9 +326,10 @@ void SourceExprCommand::completeInstallable(AddCompletions & completions, std::s
evalSettings.pureEval = false;
auto state = getEvalState();
- Expr *e = state->parseExprFromFile(
- resolveExprPath(state->checkSourcePath(lookupFileArg(*state, *file)))
- );
+ auto e =
+ state->parseExprFromFile(
+ resolveExprPath(
+ lookupFileArg(*state, *file)));
Value root;
state->eval(e, root);
@@ -518,10 +520,10 @@ ref openEvalCache(
EvalState & state,
std::shared_ptr<flake::LockedFlake> lockedFlake)
{
- auto fingerprint = lockedFlake->getFingerprint();
+ auto fingerprint = lockedFlake->getFingerprint(state.store);
return make_ref<nix::eval_cache::EvalCache>(
evalSettings.useEvalCache && evalSettings.pureEval
- ? std::optional { std::cref(fingerprint) }
+ ? fingerprint
: std::nullopt,
state,
[&state, lockedFlake]()
@@ -597,12 +599,15 @@ Installables SourceExprCommand::parseInstallables(
state->eval(e, *vFile);
}
- else if (file)
- state->evalFile(lookupFileArg(*state, *file, CanonPath::fromCwd(getCommandBaseDir())), *vFile);
+ else if (file) {
+ auto dir = absPath(getCommandBaseDir());
+ state->evalFile(lookupFileArg(*state, *file, &dir), *vFile);
+ }
else if (callPackageFile) {
- auto e = state->parseExprFromString(fmt("(import {}).callPackage %s {}", CanonPath::fromCwd(*callPackageFile)), state->rootPath(CanonPath::fromCwd()));
+ auto dir = absPath(getCommandBaseDir());
+ auto fileLoc = absPath(*callPackageFile);
+ auto e = state->parseExprFromString(fmt("(import {}).callPackage %s {}", &fileLoc), state->rootPath(&dir));
state->eval(e, *vFile);
} else {
- CanonPath dir(CanonPath::fromCwd(getCommandBaseDir()));
+ Path dir = absPath(getCommandBaseDir());
auto e = state->parseExprFromString(*expr, state->rootPath(dir));
state->eval(e, *vFile);
}
@@ -859,7 +864,7 @@ BuiltPaths Installable::toBuiltPaths(
}
}
-StorePathSet Installable::toStorePaths(
+StorePathSet Installable::toStorePathSet(
ref<Store> evalStore,
ref<Store> store,
Realise mode, OperateOn operateOn,
@@ -873,13 +878,27 @@ StorePathSet Installable::toStorePaths(
return outPaths;
}
+StorePaths Installable::toStorePaths(
+ ref<Store> evalStore,
+ ref<Store> store,
+ Realise mode, OperateOn operateOn,
+ const Installables & installables)
+{
+ StorePaths outPaths;
+ for (auto & path : toBuiltPaths(evalStore, store, mode, operateOn, installables)) {
+ auto thisOutPaths = path.outPaths();
+ outPaths.insert(outPaths.end(), thisOutPaths.begin(), thisOutPaths.end());
+ }
+ return outPaths;
+}
+
StorePath Installable::toStorePath(
ref<Store> evalStore,
ref<Store> store,
Realise mode, OperateOn operateOn,
ref<Installable> installable)
{
- auto paths = toStorePaths(evalStore, store, mode, operateOn, {installable});
+ auto paths = toStorePathSet(evalStore, store, mode, operateOn, {installable});
if (paths.size() != 1)
throw Error("argument '%s' should evaluate to one store path", installable->what());
diff --git a/src/libcmd/installables.hh b/src/libcmd/installables.hh
index e087f935c..bf5759230 100644
--- a/src/libcmd/installables.hh
+++ b/src/libcmd/installables.hh
@@ -12,7 +12,7 @@
namespace nix {
-struct DrvInfo;
+struct PackageInfo;
enum class Realise {
/**
@@ -165,7 +165,14 @@ struct Installable
const Installables & installables,
BuildMode bMode = bmNormal);
- static std::set<StorePath> toStorePaths(
+ static std::set<StorePath> toStorePathSet(
+ ref<Store> evalStore,
+ ref<Store> store,
+ Realise mode,
+ OperateOn operateOn,
+ const Installables & installables);
+
+ static std::vector<StorePath> toStorePaths(
ref<Store> evalStore,
ref<Store> store,
Realise mode,
diff --git a/src/libcmd/local.mk b/src/libcmd/local.mk
index afd35af08..abb7459a7 100644
--- a/src/libcmd/local.mk
+++ b/src/libcmd/local.mk
@@ -8,7 +8,7 @@ libcmd_SOURCES := $(wildcard $(d)/*.cc)
libcmd_CXXFLAGS += -I src/libutil -I src/libstore -I src/libexpr -I src/libmain -I src/libfetchers
-libcmd_LDFLAGS = $(EDITLINE_LIBS) $(LOWDOWN_LIBS) -pthread
+libcmd_LDFLAGS = $(EDITLINE_LIBS) $(LOWDOWN_LIBS) $(THREAD_LDFLAGS)
libcmd_LIBS = libstore libutil libexpr libmain libfetchers
diff --git a/src/libcmd/markdown.cc b/src/libcmd/markdown.cc
index 8b3bbc1b5..a4e3c5a77 100644
--- a/src/libcmd/markdown.cc
+++ b/src/libcmd/markdown.cc
@@ -4,12 +4,15 @@
#include "terminal.hh"
#include <sys/queue.h>
+#if HAVE_LOWDOWN
#include <lowdown.h>
+#endif
namespace nix {
std::string renderMarkdownToTerminal(std::string_view markdown)
{
+#if HAVE_LOWDOWN
int windowWidth = getWindowSize().second;
struct lowdown_opts opts {
@@ -48,6 +51,9 @@ std::string renderMarkdownToTerminal(std::string_view markdown)
throw Error("allocation error while rendering Markdown");
return filterANSIEscapes(std::string(buf->data, buf->size), !shouldANSI());
+#else
+ return std::string(markdown);
+#endif
}
}
diff --git a/src/libcmd/misc-store-flags.cc b/src/libcmd/misc-store-flags.cc
new file mode 100644
index 000000000..e66d3f63b
--- /dev/null
+++ b/src/libcmd/misc-store-flags.cc
@@ -0,0 +1,121 @@
+#include "misc-store-flags.hh"
+
+namespace nix::flag
+{
+
+static void hashFormatCompleter(AddCompletions & completions, size_t index, std::string_view prefix)
+{
+ for (auto & format : hashFormats) {
+ if (hasPrefix(format, prefix)) {
+ completions.add(format);
+ }
+ }
+}
+
+Args::Flag hashFormatWithDefault(std::string && longName, HashFormat * hf)
+{
+ assert(*hf == nix::HashFormat::SRI);
+ return Args::Flag {
+ .longName = std::move(longName),
+ .description = "Hash format (`base16`, `nix32`, `base64`, `sri`). Default: `sri`.",
+ .labels = {"hash-format"},
+ .handler = {[hf](std::string s) {
+ *hf = parseHashFormat(s);
+ }},
+ .completer = hashFormatCompleter,
+ };
+}
+
+Args::Flag hashFormatOpt(std::string && longName, std::optional<HashFormat> * ohf)
+{
+ return Args::Flag {
+ .longName = std::move(longName),
+ .description = "Hash format (`base16`, `nix32`, `base64`, `sri`).",
+ .labels = {"hash-format"},
+ .handler = {[ohf](std::string s) {
+ *ohf = std::optional{parseHashFormat(s)};
+ }},
+ .completer = hashFormatCompleter,
+ };
+}
+
+static void hashAlgoCompleter(AddCompletions & completions, size_t index, std::string_view prefix)
+{
+ for (auto & algo : hashAlgorithms)
+ if (hasPrefix(algo, prefix))
+ completions.add(algo);
+}
+
+Args::Flag hashAlgo(std::string && longName, HashAlgorithm * ha)
+{
+ return Args::Flag {
+ .longName = std::move(longName),
+ .description = "Hash algorithm (`md5`, `sha1`, `sha256`, or `sha512`).",
+ .labels = {"hash-algo"},
+ .handler = {[ha](std::string s) {
+ *ha = parseHashAlgo(s);
+ }},
+ .completer = hashAlgoCompleter,
+ };
+}
+
+Args::Flag hashAlgoOpt(std::string && longName, std::optional<HashAlgorithm> * oha)
+{
+ return Args::Flag {
+ .longName = std::move(longName),
+ .description = "Hash algorithm (`md5`, `sha1`, `sha256`, or `sha512`). Can be omitted for SRI hashes.",
+ .labels = {"hash-algo"},
+ .handler = {[oha](std::string s) {
+ *oha = std::optional{parseHashAlgo(s)};
+ }},
+ .completer = hashAlgoCompleter,
+ };
+}
+
+Args::Flag fileIngestionMethod(FileIngestionMethod * method)
+{
+ return Args::Flag {
+ .longName = "mode",
+ // FIXME indentation carefully made for context, this is messed up.
+ .description = R"(
+ How to compute the hash of the input.
+ One of:
+
+ - `nar` (the default): Serialises the input as an archive (following the [_Nix Archive Format_](https://edolstra.github.io/pubs/phd-thesis.pdf#page=101)) and passes that to the hash function.
+
+ - `flat`: Assumes that the input is a single file and directly passes it to the hash function;
+ )",
+ .labels = {"file-ingestion-method"},
+ .handler = {[method](std::string s) {
+ *method = parseFileIngestionMethod(s);
+ }},
+ };
+}
+
+Args::Flag contentAddressMethod(ContentAddressMethod * method)
+{
+ return Args::Flag {
+ .longName = "mode",
+ // FIXME indentation carefully made for context, this is messed up.
+ .description = R"(
+ How to compute the content-address of the store object.
+ One of:
+
+ - `nar` (the default): Serialises the input as an archive (following the [_Nix Archive Format_](https://edolstra.github.io/pubs/phd-thesis.pdf#page=101)) and passes that to the hash function.
+
+ - `flat`: Assumes that the input is a single file and directly passes it to the hash function;
+
+ - `text`: Like `flat`, but used for
+ [derivations](@docroot@/glossary.md#store-derivation) serialized in store object and
+ [`builtins.toFile`](@docroot@/language/builtins.html#builtins-toFile).
+ For advanced use-cases only;
+ for regular usage prefer `nar` and `flat`.
+ )",
+ .labels = {"content-address-method"},
+ .handler = {[method](std::string s) {
+ *method = ContentAddressMethod::parse(s);
+ }},
+ };
+}
+
+}
diff --git a/src/libcmd/misc-store-flags.hh b/src/libcmd/misc-store-flags.hh
new file mode 100644
index 000000000..124372af7
--- /dev/null
+++ b/src/libcmd/misc-store-flags.hh
@@ -0,0 +1,21 @@
+#include "args.hh"
+#include "content-address.hh"
+
+namespace nix::flag {
+
+Args::Flag hashAlgo(std::string && longName, HashAlgorithm * ha);
+static inline Args::Flag hashAlgo(HashAlgorithm * ha)
+{
+ return hashAlgo("hash-algo", ha);
+}
+Args::Flag hashAlgoOpt(std::string && longName, std::optional<HashAlgorithm> * oha);
+Args::Flag hashFormatWithDefault(std::string && longName, HashFormat * hf);
+Args::Flag hashFormatOpt(std::string && longName, std::optional<HashFormat> * ohf);
+static inline Args::Flag hashAlgoOpt(std::optional<HashAlgorithm> * oha)
+{
+ return hashAlgoOpt("hash-algo", oha);
+}
+Args::Flag fileIngestionMethod(FileIngestionMethod * method);
+Args::Flag contentAddressMethod(ContentAddressMethod * method);
+
+}
diff --git a/src/libcmd/repl-interacter.cc b/src/libcmd/repl-interacter.cc
new file mode 100644
index 000000000..3e34ecdb6
--- /dev/null
+++ b/src/libcmd/repl-interacter.cc
@@ -0,0 +1,186 @@
+#include <cstdio>
+
+#ifdef USE_READLINE
+#include <readline/history.h>
+#include <readline/readline.h>
+#else
+// editline < 1.15.2 don't wrap their API for C++ usage
+// (added in https://github.com/troglobit/editline/commit/91398ceb3427b730995357e9d120539fb9bb7461).
+// This results in linker errors due to name-mangling of editline C symbols.
+// For compatibility with these versions, we wrap the API here
+// (wrapping multiple times on newer versions is no problem).
+extern "C" {
+#include <editline.h>
+}
+#endif
+
+#include "signals.hh"
+#include "finally.hh"
+#include "repl-interacter.hh"
+#include "file-system.hh"
+#include "libcmd/repl.hh"
+
+namespace nix {
+
+namespace {
+// Used to communicate to NixRepl::getLine whether a signal occurred in ::readline.
+volatile sig_atomic_t g_signal_received = 0;
+
+void sigintHandler(int signo)
+{
+ g_signal_received = signo;
+}
+};
+
+static detail::ReplCompleterMixin * curRepl; // ugly
+
+static char * completionCallback(char * s, int * match)
+{
+ auto possible = curRepl->completePrefix(s);
+ if (possible.size() == 1) {
+ *match = 1;
+ auto * res = strdup(possible.begin()->c_str() + strlen(s));
+ if (!res)
+ throw Error("allocation failure");
+ return res;
+ } else if (possible.size() > 1) {
+ auto checkAllHaveSameAt = [&](size_t pos) {
+ auto & first = *possible.begin();
+ for (auto & p : possible) {
+ if (p.size() <= pos || p[pos] != first[pos])
+ return false;
+ }
+ return true;
+ };
+ size_t start = strlen(s);
+ size_t len = 0;
+ while (checkAllHaveSameAt(start + len))
+ ++len;
+ if (len > 0) {
+ *match = 1;
+ auto * res = strdup(std::string(*possible.begin(), start, len).c_str());
+ if (!res)
+ throw Error("allocation failure");
+ return res;
+ }
+ }
+
+ *match = 0;
+ return nullptr;
+}
+
+static int listPossibleCallback(char * s, char *** avp)
+{
+ auto possible = curRepl->completePrefix(s);
+
+ if (possible.size() > (INT_MAX / sizeof(char *)))
+ throw Error("too many completions");
+
+ int ac = 0;
+ char ** vp = nullptr;
+
+ auto check = [&](auto * p) {
+ if (!p) {
+ if (vp) {
+ while (--ac >= 0)
+ free(vp[ac]);
+ free(vp);
+ }
+ throw Error("allocation failure");
+ }
+ return p;
+ };
+
+ vp = check((char **) malloc(possible.size() * sizeof(char *)));
+
+ for (auto & p : possible)
+ vp[ac++] = check(strdup(p.c_str()));
+
+ *avp = vp;
+
+ return ac;
+}
+
+ReadlineLikeInteracter::Guard ReadlineLikeInteracter::init(detail::ReplCompleterMixin * repl)
+{
+ // Allow nix-repl specific settings in .inputrc
+ rl_readline_name = "nix-repl";
+ try {
+ createDirs(dirOf(historyFile));
+ } catch (SystemError & e) {
+ logWarning(e.info());
+ }
+#ifndef USE_READLINE
+ el_hist_size = 1000;
+#endif
+ read_history(historyFile.c_str());
+ auto oldRepl = curRepl;
+ curRepl = repl;
+ Guard restoreRepl([oldRepl] { curRepl = oldRepl; });
+#ifndef USE_READLINE
+ rl_set_complete_func(completionCallback);
+ rl_set_list_possib_func(listPossibleCallback);
+#endif
+ return restoreRepl;
+}
+
+static constexpr const char * promptForType(ReplPromptType promptType)
+{
+ switch (promptType) {
+ case ReplPromptType::ReplPrompt:
+ return "nix-repl> ";
+ case ReplPromptType::ContinuationPrompt:
+ return " ";
+ }
+ assert(false);
+}
+
+bool ReadlineLikeInteracter::getLine(std::string & input, ReplPromptType promptType)
+{
+ struct sigaction act, old;
+ sigset_t savedSignalMask, set;
+
+ auto setupSignals = [&]() {
+ act.sa_handler = sigintHandler;
+ sigfillset(&act.sa_mask);
+ act.sa_flags = 0;
+ if (sigaction(SIGINT, &act, &old))
+ throw SysError("installing handler for SIGINT");
+
+ sigemptyset(&set);
+ sigaddset(&set, SIGINT);
+ if (sigprocmask(SIG_UNBLOCK, &set, &savedSignalMask))
+ throw SysError("unblocking SIGINT");
+ };
+ auto restoreSignals = [&]() {
+ if (sigprocmask(SIG_SETMASK, &savedSignalMask, nullptr))
+ throw SysError("restoring signals");
+
+ if (sigaction(SIGINT, &old, 0))
+ throw SysError("restoring handler for SIGINT");
+ };
+
+ setupSignals();
+ char * s = readline(promptForType(promptType));
+ Finally doFree([&]() { free(s); });
+ restoreSignals();
+
+ if (g_signal_received) {
+ g_signal_received = 0;
+ input.clear();
+ return true;
+ }
+
+ if (!s)
+ return false;
+ input += s;
+ input += '\n';
+ return true;
+}
+
+ReadlineLikeInteracter::~ReadlineLikeInteracter()
+{
+ write_history(historyFile.c_str());
+}
+
+};
diff --git a/src/libcmd/repl-interacter.hh b/src/libcmd/repl-interacter.hh
new file mode 100644
index 000000000..cc70efd07
--- /dev/null
+++ b/src/libcmd/repl-interacter.hh
@@ -0,0 +1,48 @@
+#pragma once
+/// @file
+
+#include "finally.hh"
+#include "types.hh"
+#include <functional>
+#include <string>
+
+namespace nix {
+
+namespace detail {
+/** Provides the completion hooks for the repl, without exposing its complete
+ * internals. */
+struct ReplCompleterMixin {
+ virtual StringSet completePrefix(const std::string & prefix) = 0;
+};
+};
+
+enum class ReplPromptType {
+ ReplPrompt,
+ ContinuationPrompt,
+};
+
+class ReplInteracter
+{
+public:
+ using Guard = Finally<std::function<void()>>;
+
+ virtual Guard init(detail::ReplCompleterMixin * repl) = 0;
+ /** Returns a boolean of whether the interacter got EOF */
+ virtual bool getLine(std::string & input, ReplPromptType promptType) = 0;
+ virtual ~ReplInteracter(){};
+};
+
+class ReadlineLikeInteracter : public virtual ReplInteracter
+{
+ std::string historyFile;
+public:
+ ReadlineLikeInteracter(std::string historyFile)
+ : historyFile(historyFile)
+ {
+ }
+ virtual Guard init(detail::ReplCompleterMixin * repl) override;
+ virtual bool getLine(std::string & input, ReplPromptType promptType) override;
+ virtual ~ReadlineLikeInteracter() override;
+};
+
+};
diff --git a/src/libcmd/repl.cc b/src/libcmd/repl.cc
index adba71206..4a501e575 100644
--- a/src/libcmd/repl.cc
+++ b/src/libcmd/repl.cc
@@ -3,32 +3,17 @@
#include
#include
-#include
-
-#ifdef READLINE
-#include <readline/history.h>
-#include <readline/readline.h>
-#else
-// editline < 1.15.2 don't wrap their API for C++ usage
-// (added in https://github.com/troglobit/editline/commit/91398ceb3427b730995357e9d120539fb9bb7461).
-// This results in linker errors due to to name-mangling of editline C symbols.
-// For compatibility with these versions, we wrap the API here
-// (wrapping multiple times on newer versions is no problem).
-extern "C" {
-#include <editline.h>
-}
-#endif
-
+#include "libcmd/repl-interacter.hh"
#include "repl.hh"
#include "ansicolor.hh"
-#include "signals.hh"
#include "shared.hh"
#include "eval.hh"
#include "eval-cache.hh"
#include "eval-inline.hh"
#include "eval-settings.hh"
#include "attr-path.hh"
+#include "signals.hh"
#include "store-api.hh"
#include "log-store.hh"
#include "common-eval-args.hh"
@@ -38,7 +23,6 @@ extern "C" {
#include "flake/flake.hh"
#include "flake/lockfile.hh"
#include "users.hh"
-#include "terminal.hh"
#include "editor-for.hh"
#include "finally.hh"
#include "markdown.hh"
@@ -52,8 +36,30 @@ extern "C" {
namespace nix {
+/**
+ * Returned by `NixRepl::processLine`.
+ */
+enum class ProcessLineResult {
+ /**
+ * The user exited with `:quit`. The REPL should exit. The surrounding
+ * program or evaluation (e.g., if the REPL was acting as the debugger)
+ * should also exit.
+ */
+ Quit,
+ /**
+ * The user exited with `:continue`. The REPL should exit, but the program
+ * should continue running.
+ */
+ Continue,
+ /**
+ * The user did not exit. The REPL should request another line of input.
+ */
+ PromptAgain,
+};
+
struct NixRepl
: AbstractNixRepl
+ , detail::ReplCompleterMixin
#if HAVE_BOEHMGC
, gc
#endif
@@ -69,19 +75,18 @@ struct NixRepl
int displ;
StringSet varNames;
- const Path historyFile;
+ std::unique_ptr<ReplInteracter> interacter;
NixRepl(const SearchPath & searchPath, nix::ref<Store> store, ref<EvalState> state,
std::function<AnnotatedValues()> getValues);
- virtual ~NixRepl();
+ virtual ~NixRepl() = default;
- void mainLoop() override;
+ ReplExitStatus mainLoop() override;
void initEnv() override;
- StringSet completePrefix(const std::string & prefix);
- bool getLine(std::string & input, const std::string & prompt);
+ virtual StringSet completePrefix(const std::string & prefix) override;
StorePath getDerivationPath(Value & v);
- bool processLine(std::string line);
+ ProcessLineResult processLine(std::string line);
void loadFile(const Path & path);
void loadFlake(const std::string & flakeRef);
@@ -93,9 +98,19 @@ struct NixRepl
void evalString(std::string s, Value & v);
void loadDebugTraceEnv(DebugTrace & dt);
- typedef std::set<Value *> ValuesSeen;
- std::ostream & printValue(std::ostream & str, Value & v, unsigned int maxDepth);
- std::ostream & printValue(std::ostream & str, Value & v, unsigned int maxDepth, ValuesSeen & seen);
+ void printValue(std::ostream & str,
+ Value & v,
+ unsigned int maxDepth = std::numeric_limits<unsigned int>::max())
+ {
+ ::nix::printValue(*state, str, v, PrintOptions {
+ .ansiColors = true,
+ .force = true,
+ .derivationPaths = true,
+ .maxDepth = maxDepth,
+ .prettyIndent = 2,
+ .errors = ErrorPrintBehavior::ThrowTopLevel,
+ });
+ }
};
std::string removeWhitespace(std::string s)
@@ -112,17 +127,11 @@ NixRepl::NixRepl(const SearchPath & searchPath, nix::ref<Store> store, ref<EvalState> state,
- , staticEnv(new StaticEnv(false, state->staticBaseEnv.get()))
- , historyFile(getDataDir() + "/nix/repl-history")
+ , staticEnv(new StaticEnv(nullptr, state->staticBaseEnv.get()))
+ , interacter(make_unique<ReadlineLikeInteracter>(getDataDir() + "/nix/repl-history"))
{
}
-
-NixRepl::~NixRepl()
-{
- write_history(historyFile.c_str());
-}
-
void runNix(Path program, const Strings & args,
const std::optional & input = {})
{
@@ -139,79 +148,6 @@ void runNix(Path program, const Strings & args,
return;
}
-static NixRepl * curRepl; // ugly
-
-static char * completionCallback(char * s, int *match) {
- auto possible = curRepl->completePrefix(s);
- if (possible.size() == 1) {
- *match = 1;
- auto *res = strdup(possible.begin()->c_str() + strlen(s));
- if (!res) throw Error("allocation failure");
- return res;
- } else if (possible.size() > 1) {
- auto checkAllHaveSameAt = [&](size_t pos) {
- auto &first = *possible.begin();
- for (auto &p : possible) {
- if (p.size() <= pos || p[pos] != first[pos])
- return false;
- }
- return true;
- };
- size_t start = strlen(s);
- size_t len = 0;
- while (checkAllHaveSameAt(start + len)) ++len;
- if (len > 0) {
- *match = 1;
- auto *res = strdup(std::string(*possible.begin(), start, len).c_str());
- if (!res) throw Error("allocation failure");
- return res;
- }
- }
-
- *match = 0;
- return nullptr;
-}
-
-static int listPossibleCallback(char *s, char ***avp) {
- auto possible = curRepl->completePrefix(s);
-
- if (possible.size() > (INT_MAX / sizeof(char*)))
- throw Error("too many completions");
-
- int ac = 0;
- char **vp = nullptr;
-
- auto check = [&](auto *p) {
- if (!p) {
- if (vp) {
- while (--ac >= 0)
- free(vp[ac]);
- free(vp);
- }
- throw Error("allocation failure");
- }
- return p;
- };
-
- vp = check((char **)malloc(possible.size() * sizeof(char*)));
-
- for (auto & p : possible)
- vp[ac++] = check(strdup(p.c_str()));
-
- *avp = vp;
-
- return ac;
-}
-
-namespace {
- // Used to communicate to NixRepl::getLine whether a signal occurred in ::readline.
- volatile sig_atomic_t g_signal_received = 0;
-
- void sigintHandler(int signo) {
- g_signal_received = signo;
- }
-}
-
static std::ostream & showDebugTrace(std::ostream & out, const PosTable & positions, const DebugTrace & dt)
{
if (dt.isError)
@@ -221,10 +157,10 @@ static std::ostream & showDebugTrace(std::ostream & out, const PosTable & positi
// prefer direct pos, but if noPos then try the expr.
auto pos = dt.pos
? dt.pos
- : static_cast<std::shared_ptr<Pos>>(positions[dt.expr.getPos() ? dt.expr.getPos() : noPos]);
+ : positions[dt.expr.getPos() ? dt.expr.getPos() : noPos];
if (pos) {
- out << pos;
+ out << *pos;
if (auto loc = pos->getCodeLines()) {
out << "\n";
printCodeLines(out, "", *pos, *loc);
@@ -235,31 +171,23 @@ static std::ostream & showDebugTrace(std::ostream & out, const PosTable & positi
return out;
}
-void NixRepl::mainLoop()
+static bool isFirstRepl = true;
+
+ReplExitStatus NixRepl::mainLoop()
{
- std::string error = ANSI_RED "error:" ANSI_NORMAL " ";
- notice("Welcome to Nix Super " + nixVersion + ". Type :? for help.\n");
+ if (isFirstRepl) {
+ std::string_view debuggerNotice = "";
+ if (state->debugRepl) {
+ debuggerNotice = " debugger";
+ }
+ notice("Nix Super %1%%2%\nType :? for help.", nixVersion, debuggerNotice);
+ }
+
+ isFirstRepl = false;
loadFiles();
- // Allow nix-repl specific settings in .inputrc
- rl_readline_name = "nix-repl";
- try {
- createDirs(dirOf(historyFile));
- } catch (SysError & e) {
- logWarning(e.info());
- }
-#ifndef READLINE
- el_hist_size = 1000;
-#endif
- read_history(historyFile.c_str());
- auto oldRepl = curRepl;
- curRepl = this;
- Finally restoreRepl([&] { curRepl = oldRepl; });
-#ifndef READLINE
- rl_set_complete_func(completionCallback);
- rl_set_list_possib_func(listPossibleCallback);
-#endif
+ auto _guard = interacter->init(static_cast<detail::ReplCompleterMixin *>(this));
std::string input;
@@ -268,16 +196,26 @@ void NixRepl::mainLoop()
logger->pause();
// When continuing input from previous lines, don't print a prompt, just align to the same
// number of chars as the prompt.
- if (!getLine(input, input.empty() ? "nix-repl> " : " ")) {
- // ctrl-D should exit the debugger.
+ if (!interacter->getLine(input, input.empty() ? ReplPromptType::ReplPrompt : ReplPromptType::ContinuationPrompt)) {
+ // Ctrl-D should exit the debugger.
state->debugStop = false;
- state->debugQuit = true;
logger->cout("");
- break;
+ // TODO: Should Ctrl-D exit just the current debugger session or
+ // the entire program?
+ return ReplExitStatus::QuitAll;
}
logger->resume();
try {
- if (!removeWhitespace(input).empty() && !processLine(input)) return;
+ switch (processLine(input)) {
+ case ProcessLineResult::Quit:
+ return ReplExitStatus::QuitAll;
+ case ProcessLineResult::Continue:
+ return ReplExitStatus::Continue;
+ case ProcessLineResult::PromptAgain:
+ break;
+ default:
+ abort();
+ }
} catch (ParseError & e) {
if (e.msg().find("unexpected end of file") != std::string::npos) {
// For parse errors on incomplete input, we continue waiting for the next line of
@@ -287,13 +225,7 @@ void NixRepl::mainLoop()
printMsg(lvlError, e.msg());
}
} catch (EvalError & e) {
- // in debugger mode, an EvalError should trigger another repl session.
- // when that session returns the exception will land here. No need to show it again;
- // show the error for this repl session instead.
- if (state->debugRepl && !state->debugTraces.empty())
- showDebugTrace(std::cout, state->positions, state->debugTraces.front());
- else
- printMsg(lvlError, e.msg());
+ printMsg(lvlError, e.msg());
} catch (Error & e) {
printMsg(lvlError, e.msg());
} catch (Interrupted & e) {
@@ -307,52 +239,6 @@ void NixRepl::mainLoop()
}
}
-
-bool NixRepl::getLine(std::string & input, const std::string & prompt)
-{
- struct sigaction act, old;
- sigset_t savedSignalMask, set;
-
- auto setupSignals = [&]() {
- act.sa_handler = sigintHandler;
- sigfillset(&act.sa_mask);
- act.sa_flags = 0;
- if (sigaction(SIGINT, &act, &old))
- throw SysError("installing handler for SIGINT");
-
- sigemptyset(&set);
- sigaddset(&set, SIGINT);
- if (sigprocmask(SIG_UNBLOCK, &set, &savedSignalMask))
- throw SysError("unblocking SIGINT");
- };
- auto restoreSignals = [&]() {
- if (sigprocmask(SIG_SETMASK, &savedSignalMask, nullptr))
- throw SysError("restoring signals");
-
- if (sigaction(SIGINT, &old, 0))
- throw SysError("restoring handler for SIGINT");
- };
-
- setupSignals();
- Finally resetTerminal([&]() { rl_deprep_terminal(); });
- char * s = readline(prompt.c_str());
- Finally doFree([&]() { free(s); });
- restoreSignals();
-
- if (g_signal_received) {
- g_signal_received = 0;
- input.clear();
- return true;
- }
-
- if (!s)
- return false;
- input += s;
- input += '\n';
- return true;
-}
-
-
StringSet NixRepl::completePrefix(const std::string & prefix)
{
StringSet completions;
@@ -414,8 +300,6 @@ StringSet NixRepl::completePrefix(const std::string & prefix)
// Quietly ignore parse errors.
} catch (EvalError & e) {
// Quietly ignore evaluation errors.
- } catch (UndefinedVarError & e) {
- // Quietly ignore undefined variable errors.
} catch (BadURL & e) {
// Quietly ignore BadURL flake-related errors.
}
@@ -442,10 +326,10 @@ static bool isVarName(std::string_view s)
StorePath NixRepl::getDerivationPath(Value & v) {
- auto drvInfo = getDerivation(*state, v, false);
- if (!drvInfo)
+ auto packageInfo = getDerivation(*state, v, false);
+ if (!packageInfo)
throw Error("expression does not evaluate to a derivation, so I can't build it");
- auto drvPath = drvInfo->queryDrvPath();
+ auto drvPath = packageInfo->queryDrvPath();
if (!drvPath)
throw Error("expression did not evaluate to a valid derivation (no 'drvPath' attribute)");
if (!state->store->isValidPath(*drvPath))
@@ -467,10 +351,11 @@ void NixRepl::loadDebugTraceEnv(DebugTrace & dt)
}
}
-bool NixRepl::processLine(std::string line)
+ProcessLineResult NixRepl::processLine(std::string line)
{
line = trim(line);
- if (line == "") return true;
+ if (line.empty())
+ return ProcessLineResult::PromptAgain;
_isInterrupted = false;
@@ -501,6 +386,7 @@ bool NixRepl::processLine(std::string line)
<< " :l, :load Load Nix expression and add it to scope\n"
<< " :lf, :load-flake [ Load Nix flake and add it to scope\n"
<< " :p, :print Evaluate and print expression recursively\n"
+ << " Strings are printed directly, without escaping.\n"
<< " :q, :quit Exit nix-repl\n"
<< " :r, :reload Reload all files\n"
<< " :sh Build dependencies of derivation, then start\n"
@@ -565,13 +451,13 @@ bool NixRepl::processLine(std::string line)
else if (state->debugRepl && (command == ":s" || command == ":step")) {
// set flag to stop at next DebugTrace; exit repl.
state->debugStop = true;
- return false;
+ return ProcessLineResult::Continue;
}
else if (state->debugRepl && (command == ":c" || command == ":continue")) {
// set flag to run to next breakpoint or end of program; exit repl.
state->debugStop = false;
- return false;
+ return ProcessLineResult::Continue;
}
else if (command == ":a" || command == ":add") {
@@ -708,13 +594,17 @@ bool NixRepl::processLine(std::string line)
else if (command == ":p" || command == ":print") {
Value v;
evalString(arg, v);
- printValue(std::cout, v, 1000000000) << std::endl;
+ if (v.type() == nString) {
+ std::cout << v.string_view();
+ } else {
+ printValue(std::cout, v);
+ }
+ std::cout << std::endl;
}
else if (command == ":q" || command == ":quit") {
state->debugStop = false;
- state->debugQuit = true;
- return false;
+ return ProcessLineResult::Quit;
}
else if (command == ":doc") {
@@ -770,11 +660,12 @@ bool NixRepl::processLine(std::string line)
} else {
Value v;
evalString(line, v);
- printValue(std::cout, v, 1) << std::endl;
+ printValue(std::cout, v, 1);
+ std::cout << std::endl;
}
}
- return true;
+ return ProcessLineResult::PromptAgain;
}
void NixRepl::loadFile(const Path & path)
@@ -880,7 +771,7 @@ void NixRepl::addVarToScope(const Symbol name, Value & v)
Expr * NixRepl::parseString(std::string s)
{
- return state->parseExprFromString(std::move(s), state->rootPath(CanonPath::fromCwd()), staticEnv);
+ return state->parseExprFromString(std::move(s), state->rootPath("."), staticEnv);
}
@@ -888,145 +779,7 @@ void NixRepl::evalString(std::string s, Value & v)
{
Expr * e = parseString(s);
e->eval(*state, *env, v);
- state->forceValue(v, [&]() { return v.determinePos(noPos); });
-}
-
-
-std::ostream & NixRepl::printValue(std::ostream & str, Value & v, unsigned int maxDepth)
-{
- ValuesSeen seen;
- return printValue(str, v, maxDepth, seen);
-}
-
-
-
-
-// FIXME: lot of cut&paste from Nix's eval.cc.
-std::ostream & NixRepl::printValue(std::ostream & str, Value & v, unsigned int maxDepth, ValuesSeen & seen)
-{
- str.flush();
- checkInterrupt();
-
- state->forceValue(v, [&]() { return v.determinePos(noPos); });
-
- switch (v.type()) {
-
- case nInt:
- str << ANSI_CYAN << v.integer << ANSI_NORMAL;
- break;
-
- case nBool:
- str << ANSI_CYAN;
- printLiteralBool(str, v.boolean);
- str << ANSI_NORMAL;
- break;
-
- case nString:
- str << ANSI_WARNING;
- printLiteralString(str, v.string_view());
- str << ANSI_NORMAL;
- break;
-
- case nPath:
- str << ANSI_GREEN << v.path().to_string() << ANSI_NORMAL; // !!! escaping?
- break;
-
- case nNull:
- str << ANSI_CYAN "null" ANSI_NORMAL;
- break;
-
- case nAttrs: {
- seen.insert(&v);
-
- bool isDrv = state->isDerivation(v);
-
- if (isDrv) {
- str << "«derivation ";
- Bindings::iterator i = v.attrs->find(state->sDrvPath);
- NixStringContext context;
- if (i != v.attrs->end())
- str << state->store->printStorePath(state->coerceToStorePath(i->pos, *i->value, context, "while evaluating the drvPath of a derivation"));
- else
- str << "???";
- str << "»";
- }
-
- else if (maxDepth > 0) {
- str << "{ ";
-
- typedef std::map<std::string, Value *> Sorted;
- Sorted sorted;
- for (auto & i : *v.attrs)
- sorted.emplace(state->symbols[i.name], i.value);
-
- for (auto & i : sorted) {
- printAttributeName(str, i.first);
- str << " = ";
- if (seen.count(i.second))
- str << "«repeated»";
- else
- try {
- printValue(str, *i.second, maxDepth - 1, seen);
- } catch (AssertionError & e) {
- str << ANSI_RED "«error: " << e.msg() << "»" ANSI_NORMAL;
- }
- str << "; ";
- }
-
- str << "}";
- } else
- str << "{ ... }";
-
- break;
- }
-
- case nList:
- seen.insert(&v);
-
- str << "[ ";
- if (maxDepth > 0)
- for (auto elem : v.listItems()) {
- if (seen.count(elem))
- str << "«repeated»";
- else
- try {
- printValue(str, *elem, maxDepth - 1, seen);
- } catch (AssertionError & e) {
- str << ANSI_RED "«error: " << e.msg() << "»" ANSI_NORMAL;
- }
- str << " ";
- }
- else
- str << "... ";
- str << "]";
- break;
-
- case nFunction:
- if (v.isLambda()) {
- std::ostringstream s;
- s << state->positions[v.lambda.fun->pos];
- str << ANSI_BLUE "«lambda @ " << filterANSIEscapes(s.str()) << "»" ANSI_NORMAL;
- } else if (v.isPrimOp()) {
- str << ANSI_MAGENTA "«primop»" ANSI_NORMAL;
- } else if (v.isPrimOpApp()) {
- str << ANSI_BLUE "«primop-app»" ANSI_NORMAL;
- } else {
- abort();
- }
- break;
-
- case nFloat:
- str << v.fpoint;
- break;
-
- case nThunk:
- case nExternal:
- default:
- str << ANSI_RED "«unknown»" ANSI_NORMAL;
- break;
- }
-
- return str;
+ state->forceValue(v, v.determinePos(noPos));
}
@@ -1043,7 +796,7 @@ std::unique_ptr AbstractNixRepl::create(
}
-void AbstractNixRepl::runSimple(
+ReplExitStatus AbstractNixRepl::runSimple(
ref<EvalState> evalState,
const ValMap & extraEnv)
{
@@ -1065,7 +818,7 @@ void AbstractNixRepl::runSimple(
for (auto & [name, value] : extraEnv)
repl->addVarToScope(repl->state->symbols.create(name), *value);
- repl->mainLoop();
+ return repl->mainLoop();
}
}
diff --git a/src/libcmd/repl.hh b/src/libcmd/repl.hh
index 6d88883fe..aac79ec74 100644
--- a/src/libcmd/repl.hh
+++ b/src/libcmd/repl.hh
@@ -3,11 +3,6 @@
#include "eval.hh"
-#if HAVE_BOEHMGC
-#define GC_INCLUDE_NEW
-#include <gc/gc_cpp.h>
-#endif
-
namespace nix {
struct AbstractNixRepl
@@ -28,13 +23,13 @@ struct AbstractNixRepl
const SearchPath & searchPath, nix::ref<Store> store, ref<EvalState> state,
std::function<AnnotatedValues()> getValues);
- static void runSimple(
+ static ReplExitStatus runSimple(
ref<EvalState> evalState,
const ValMap & extraEnv);
virtual void initEnv() = 0;
- virtual void mainLoop() = 0;
+ virtual ReplExitStatus mainLoop() = 0;
};
}
diff --git a/src/libexpr/attr-path.cc b/src/libexpr/attr-path.cc
index 7481a2232..d6befd362 100644
--- a/src/libexpr/attr-path.cc
+++ b/src/libexpr/attr-path.cc
@@ -65,10 +65,10 @@ std::pair findAlongAttrPath(EvalState & state, const std::strin
if (!attrIndex) {
if (v->type() != nAttrs)
- throw TypeError(
+ state.error<TypeError>(
"the expression selected by the selection path '%1%' should be a set but is %2%",
attrPath,
- showType(*v));
+ showType(*v)).debugThrow();
if (attr.empty())
throw Error("empty attribute name in selection path '%1%'", attrPath);
@@ -88,10 +88,10 @@ std::pair findAlongAttrPath(EvalState & state, const std::strin
else {
if (!v->isList())
- throw TypeError(
+ state.error<TypeError>(
"the expression selected by the selection path '%1%' should be a list but is %2%",
attrPath,
- showType(*v));
+ showType(*v)).debugThrow();
if (*attrIndex >= v->listSize())
throw AttrPathNotFound("list index %1% in selection path '%2%' is out of range", *attrIndex, attrPath);
diff --git a/src/libexpr/eval-cache.cc b/src/libexpr/eval-cache.cc
index 5808d58b6..1538eb056 100644
--- a/src/libexpr/eval-cache.cc
+++ b/src/libexpr/eval-cache.cc
@@ -491,7 +491,7 @@ std::shared_ptr AttrCursor::maybeGetAttr(Symbol name, bool forceErro
if (forceErrors)
debug("reevaluating failed cached attribute '%s'", getAttrPathStr(name));
else
- throw CachedEvalError("cached failure of attribute '%s'", getAttrPathStr(name));
+ throw CachedEvalError(root->state, "cached failure of attribute '%s'", getAttrPathStr(name));
} else
return std::make_shared<AttrCursor>(root,
std::make_pair(shared_from_this(), name), nullptr, std::move(attr));
@@ -500,7 +500,7 @@ std::shared_ptr AttrCursor::maybeGetAttr(Symbol name, bool forceErro
// evaluate to see whether 'name' exists
} else
return nullptr;
- //throw TypeError("'%s' is not an attribute set", getAttrPathStr());
+ //error("'%s' is not an attribute set", getAttrPathStr()).debugThrow();
}
}
@@ -508,7 +508,7 @@ std::shared_ptr AttrCursor::maybeGetAttr(Symbol name, bool forceErro
if (v.type() != nAttrs)
return nullptr;
- //throw TypeError("'%s' is not an attribute set", getAttrPathStr());
+ //error("'%s' is not an attribute set", getAttrPathStr()).debugThrow();
auto attr = v.attrs->get(name);
@@ -574,14 +574,14 @@ std::string AttrCursor::getString()
debug("using cached string attribute '%s'", getAttrPathStr());
return s->first;
} else
- root->state.error("'%s' is not a string", getAttrPathStr()).debugThrow();
+ root->state.error("'%s' is not a string", getAttrPathStr()).debugThrow();
}
}
auto & v = forceValue();
if (v.type() != nString && v.type() != nPath)
- root->state.error("'%s' is not a string but %s", getAttrPathStr()).debugThrow();
+ root->state.error("'%s' is not a string but %s", getAttrPathStr(), showType(v)).debugThrow();
return v.type() == nString ? v.c_str() : v.path().to_string();
}
@@ -616,7 +616,7 @@ string_t AttrCursor::getStringWithContext()
return *s;
}
} else
- root->state.error("'%s' is not a string", getAttrPathStr()).debugThrow();
+ root->state.error("'%s' is not a string", getAttrPathStr()).debugThrow();
}
}
@@ -630,7 +630,7 @@ string_t AttrCursor::getStringWithContext()
else if (v.type() == nPath)
return {v.path().to_string(), {}};
else
- root->state.error("'%s' is not a string but %s", getAttrPathStr()).debugThrow();
+ root->state.error("'%s' is not a string but %s", getAttrPathStr(), showType(v)).debugThrow();
}
bool AttrCursor::getBool()
@@ -643,14 +643,14 @@ bool AttrCursor::getBool()
debug("using cached Boolean attribute '%s'", getAttrPathStr());
return *b;
} else
- root->state.error("'%s' is not a Boolean", getAttrPathStr()).debugThrow();
+ root->state.error("'%s' is not a Boolean", getAttrPathStr()).debugThrow();
}
}
auto & v = forceValue();
if (v.type() != nBool)
- root->state.error("'%s' is not a Boolean", getAttrPathStr()).debugThrow();
+ root->state.error("'%s' is not a Boolean", getAttrPathStr()).debugThrow();
return v.boolean;
}
@@ -665,14 +665,14 @@ NixInt AttrCursor::getInt()
debug("using cached integer attribute '%s'", getAttrPathStr());
return i->x;
} else
- throw TypeError("'%s' is not an integer", getAttrPathStr());
+ root->state.error("'%s' is not an integer", getAttrPathStr()).debugThrow();
}
}
auto & v = forceValue();
if (v.type() != nInt)
- throw TypeError("'%s' is not an integer", getAttrPathStr());
+ root->state.error("'%s' is not an integer", getAttrPathStr()).debugThrow();
return v.integer;
}
@@ -687,7 +687,7 @@ std::vector AttrCursor::getListOfStrings()
debug("using cached list of strings attribute '%s'", getAttrPathStr());
return *l;
} else
- throw TypeError("'%s' is not a list of strings", getAttrPathStr());
+ root->state.error("'%s' is not a list of strings", getAttrPathStr()).debugThrow();
}
}
@@ -697,7 +697,7 @@ std::vector AttrCursor::getListOfStrings()
root->state.forceValue(v, noPos);
if (v.type() != nList)
- throw TypeError("'%s' is not a list", getAttrPathStr());
+ root->state.error("'%s' is not a list", getAttrPathStr()).debugThrow();
std::vector res;
@@ -720,14 +720,14 @@ std::vector AttrCursor::getAttrs()
debug("using cached attrset attribute '%s'", getAttrPathStr());
return *attrs;
} else
- root->state.error("'%s' is not an attribute set", getAttrPathStr()).debugThrow();
+ root->state.error("'%s' is not an attribute set", getAttrPathStr()).debugThrow();
}
}
auto & v = forceValue();
if (v.type() != nAttrs)
- root->state.error("'%s' is not an attribute set", getAttrPathStr()).debugThrow();
+ root->state.error("'%s' is not an attribute set", getAttrPathStr()).debugThrow();
std::vector attrs;
for (auto & attr : *getValue().attrs)
diff --git a/src/libexpr/eval-error.cc b/src/libexpr/eval-error.cc
new file mode 100644
index 000000000..8db03610b
--- /dev/null
+++ b/src/libexpr/eval-error.cc
@@ -0,0 +1,105 @@
+#include "eval-error.hh"
+#include "eval.hh"
+#include "value.hh"
+
+namespace nix {
+
+template<class T>
+EvalErrorBuilder<T> & EvalErrorBuilder<T>::withExitStatus(unsigned int exitStatus)
+{
+ error.withExitStatus(exitStatus);
+ return *this;
+}
+
+template<class T>
+EvalErrorBuilder<T> & EvalErrorBuilder<T>::atPos(PosIdx pos)
+{
+ error.err.pos = error.state.positions[pos];
+ return *this;
+}
+
+template<class T>
+EvalErrorBuilder<T> & EvalErrorBuilder<T>::atPos(Value & value, PosIdx fallback)
+{
+ return atPos(value.determinePos(fallback));
+}
+
+template<class T>
+EvalErrorBuilder<T> & EvalErrorBuilder<T>::withTrace(PosIdx pos, const std::string_view text)
+{
+ error.err.traces.push_front(
+ Trace{.pos = error.state.positions[pos], .hint = HintFmt(std::string(text))});
+ return *this;
+}
+
+template<class T>
+EvalErrorBuilder<T> & EvalErrorBuilder<T>::withSuggestions(Suggestions & s)
+{
+ error.err.suggestions = s;
+ return *this;
+}
+
+template<class T>
+EvalErrorBuilder<T> & EvalErrorBuilder<T>::withFrame(const Env & env, const Expr & expr)
+{
+ // NOTE: This is abusing side-effects.
+ // TODO: check compatibility with nested debugger calls.
+ // TODO: What side-effects??
+ error.state.debugTraces.push_front(DebugTrace{
+ .pos = error.state.positions[expr.getPos()],
+ .expr = expr,
+ .env = env,
+ .hint = HintFmt("Fake frame for debugging purposes"),
+ .isError = true});
+ return *this;
+}
+
+template<class T>
+EvalErrorBuilder<T> & EvalErrorBuilder<T>::addTrace(PosIdx pos, HintFmt hint)
+{
+ error.addTrace(error.state.positions[pos], hint);
+ return *this;
+}
+
+template<class T>
+template<typename... Args>
+EvalErrorBuilder<T> &
+EvalErrorBuilder<T>::addTrace(PosIdx pos, std::string_view formatString, const Args &... formatArgs)
+{
+
+ addTrace(error.state.positions[pos], HintFmt(std::string(formatString), formatArgs...));
+ return *this;
+}
+
+template<class T>
+void EvalErrorBuilder<T>::debugThrow()
+{
+ if (error.state.debugRepl && !error.state.debugTraces.empty()) {
+ const DebugTrace & last = error.state.debugTraces.front();
+ const Env * env = &last.env;
+ const Expr * expr = &last.expr;
+ error.state.runDebugRepl(&error, *env, *expr);
+ }
+
+ // `EvalState` is the only class that can construct an `EvalErrorBuilder`,
+ // and it does so in dynamic storage. This is the final method called on
+ // any such instance and must delete itself before throwing the underlying
+ // error.
+ auto error = std::move(this->error);
+ delete this;
+
+ throw error;
+}
+
+template class EvalErrorBuilder<EvalError>;
+template class EvalErrorBuilder<AssertionError>;
+template class EvalErrorBuilder<ThrownError>;
+template class EvalErrorBuilder<Abort>;
+template class EvalErrorBuilder<TypeError>;
+template class EvalErrorBuilder<UndefinedVarError>;
+template class EvalErrorBuilder<MissingArgumentError>;
+template class EvalErrorBuilder<InfiniteRecursionError>;
+template class EvalErrorBuilder<CachedEvalError>;
+template class EvalErrorBuilder<InvalidPathError>;
+
+}
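The explicit instantiations above are load-bearing: every builder method is defined in this `.cc` file rather than in the header, so each concrete error class needs its own instantiation to be usable from other translation units. As a hedged sketch of the consequence (the name `MyCustomError` is purely illustrative and not part of this change), adding a new eval error type would involve two steps:

```cpp
// Hypothetical example only: a new error type would first be declared
// alongside the others in eval-error.hh ...
MakeError(MyCustomError, EvalError);

// ... and then instantiated at the bottom of eval-error.cc, otherwise
// call sites using state.error<MyCustomError>(...) would fail to link.
template class EvalErrorBuilder<MyCustomError>;
```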
diff --git a/src/libexpr/eval-error.hh b/src/libexpr/eval-error.hh
new file mode 100644
index 000000000..7e0cbe982
--- /dev/null
+++ b/src/libexpr/eval-error.hh
@@ -0,0 +1,104 @@
+#pragma once
+
+#include <algorithm>
+
+#include "error.hh"
+#include "pos-idx.hh"
+
+namespace nix {
+
+struct Env;
+struct Expr;
+struct Value;
+
+class EvalState;
+template<class T>
+class EvalErrorBuilder;
+
+class EvalError : public Error
+{
+ template<class T>
+ friend class EvalErrorBuilder;
+public:
+ EvalState & state;
+
+ EvalError(EvalState & state, ErrorInfo && errorInfo)
+ : Error(errorInfo)
+ , state(state)
+ {
+ }
+
+ template<typename... Args>
+ explicit EvalError(EvalState & state, const std::string & formatString, const Args &... formatArgs)
+ : Error(formatString, formatArgs...)
+ , state(state)
+ {
+ }
+};
+
+MakeError(ParseError, Error);
+MakeError(AssertionError, EvalError);
+MakeError(ThrownError, AssertionError);
+MakeError(Abort, EvalError);
+MakeError(TypeError, EvalError);
+MakeError(UndefinedVarError, EvalError);
+MakeError(MissingArgumentError, EvalError);
+MakeError(CachedEvalError, EvalError);
+MakeError(InfiniteRecursionError, EvalError);
+
+struct InvalidPathError : public EvalError
+{
+public:
+ Path path;
+ InvalidPathError(EvalState & state, const Path & path)
+ : EvalError(state, "path '%s' is not valid", path)
+ {
+ }
+};
+
+/**
+ * `EvalErrorBuilder`s may only be constructed by `EvalState`. The `debugThrow`
+ * method must be the final method in any such `EvalErrorBuilder` usage, and it
+ * handles deleting the object.
+ */
+template<class T>
+class EvalErrorBuilder final
+{
+ friend class EvalState;
+
+ template<typename... Args>
+ explicit EvalErrorBuilder(EvalState & state, const Args &... args)
+ : error(T(state, args...))
+ {
+ }
+
+public:
+ T error;
+
+ [[nodiscard, gnu::noinline]] EvalErrorBuilder & withExitStatus(unsigned int exitStatus);
+
+ [[nodiscard, gnu::noinline]] EvalErrorBuilder & atPos(PosIdx pos);
+
+ [[nodiscard, gnu::noinline]] EvalErrorBuilder & atPos(Value & value, PosIdx fallback = noPos);
+
+ [[nodiscard, gnu::noinline]] EvalErrorBuilder & withTrace(PosIdx pos, const std::string_view text);
+
+ [[nodiscard, gnu::noinline]] EvalErrorBuilder & withFrameTrace(PosIdx pos, const std::string_view text);
+
+ [[nodiscard, gnu::noinline]] EvalErrorBuilder & withSuggestions(Suggestions & s);
+
+ [[nodiscard, gnu::noinline]] EvalErrorBuilder & withFrame(const Env & e, const Expr & ex);
+
+ [[nodiscard, gnu::noinline]] EvalErrorBuilder & addTrace(PosIdx pos, HintFmt hint);
+
+ template<typename... Args>
+ [[nodiscard, gnu::noinline]] EvalErrorBuilder &
+ addTrace(PosIdx pos, std::string_view formatString, const Args &... formatArgs);
+
+ /**
+ * Delete the `EvalErrorBuilder` and throw the underlying exception.
+ */
+ [[gnu::noinline, gnu::noreturn]] void debugThrow();
+};
+
+}
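Read as a whole, the header gives call sites a small fluent API: `EvalState::error<T>(...)` hands back a builder, the `atPos`/`withTrace`/`addTrace`/`with*` methods refine it, and `debugThrow()` ends the chain by freeing the builder and throwing `T`. A rough sketch of the intended call pattern, with a made-up message and trace text (assuming `state` is an `EvalState`, `pos` a `PosIdx`, and `v` the offending `Value`):

```cpp
// Illustrative call site; the message and trace text are invented for the example.
state.error<TypeError>("expected a string but got %s", showType(v))
    .atPos(pos)
    .addTrace(pos, "while evaluating the attribute 'name'")
    .debugThrow();  // enters the debugger when enabled, then throws TypeError
```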
diff --git a/src/libexpr/eval-inline.hh b/src/libexpr/eval-inline.hh
index a988fa40c..03320c7c9 100644
--- a/src/libexpr/eval-inline.hh
+++ b/src/libexpr/eval-inline.hh
@@ -1,7 +1,9 @@
#pragma once
///@file
+#include "print.hh"
#include "eval.hh"
+#include "eval-error.hh"
namespace nix {
@@ -73,8 +75,6 @@ Env & EvalState::allocEnv(size_t size)
#endif
env = (Env *) allocBytes(sizeof(Env) + size * sizeof(Value *));
- env->type = Env::Plain;
-
/* We assume that env->values has been cleared by the allocator; maybeThunk() and lookupVar fromWith expect this. */
return *env;
@@ -83,13 +83,6 @@ Env & EvalState::allocEnv(size_t size)
[[gnu::always_inline]]
void EvalState::forceValue(Value & v, const PosIdx pos)
-{
- forceValue(v, [&]() { return pos; });
-}
-
-
-template<typename Callable>
-void EvalState::forceValue(Value & v, Callable getPos)
{
if (v.isThunk()) {
Env * env = v.thunk.env;
@@ -100,13 +93,12 @@ void EvalState::forceValue(Value & v, Callable getPos)
expr->eval(*this, *env, v);
} catch (...) {
v.mkThunk(env, expr);
+ tryFixupBlackHolePos(v, pos);
throw;
}
}
else if (v.isApp())
- callFunction(*v.app.left, *v.app.right, v, noPos);
- else if (v.isBlackhole())
- error("infinite recursion encountered").atPos(getPos()).template debugThrow();
+ callFunction(*v.app.left, *v.app.right, v, pos);
}
@@ -121,10 +113,14 @@ template
[[gnu::always_inline]]
inline void EvalState::forceAttrs(Value & v, Callable getPos, std::string_view errorCtx)
{
- forceValue(v, noPos);
+ PosIdx pos = getPos();
+ forceValue(v, pos);
if (v.type() != nAttrs) {
- PosIdx pos = getPos();
- error("value is %1% while a set was expected", showType(v)).withTrace(pos, errorCtx).debugThrow();
+ error(
+ "expected a set but found %1%: %2%",
+ showType(v),
+ ValuePrinter(*this, v, errorPrintOptions)
+ ).withTrace(pos, errorCtx).debugThrow();
}
}
@@ -132,9 +128,13 @@ inline void EvalState::forceAttrs(Value & v, Callable getPos, std::string_view e
[[gnu::always_inline]]
inline void EvalState::forceList(Value & v, const PosIdx pos, std::string_view errorCtx)
{
- forceValue(v, noPos);
+ forceValue(v, pos);
if (!v.isList()) {
- error("value is %1% while a list was expected", showType(v)).withTrace(pos, errorCtx).debugThrow();
+ error(
+ "expected a list but found %1%: %2%",
+ showType(v),
+ ValuePrinter(*this, v, errorPrintOptions)
+ ).withTrace(pos, errorCtx).debugThrow();
}
}
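The reworked `forceAttrs` and `forceList` now thread the caller's position through `forceValue` and include a printed rendering of the offending value in the message, while the caller-supplied `errorCtx` string becomes the trace. A loose sketch of how a primop would use this, with the builtin name and wording as placeholders not taken from this diff:

```cpp
// Hypothetical primop body, modelled on the usual primops.cc calling pattern.
static void prim_example(EvalState & state, const PosIdx pos, Value ** args, Value & v)
{
    state.forceAttrs(*args[0], pos,
        "while evaluating the first argument passed to builtins.example");
    // On a type mismatch the user would now see something like
    //   error: expected a set but found an integer: 1
    // followed by the trace line supplied above.
    v.mkInt(args[0]->attrs->size());
}
```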
diff --git a/src/libexpr/eval-settings.cc b/src/libexpr/eval-settings.cc
index 444a7d7d6..2ccbe327f 100644
--- a/src/libexpr/eval-settings.cc
+++ b/src/libexpr/eval-settings.cc
@@ -89,6 +89,12 @@ std::string EvalSettings::resolvePseudoUrl(std::string_view url)
return std::string(url);
}
+const std::string & EvalSettings::getCurrentSystem()
+{
+ const auto & evalSystem = currentSystem.get();
+ return evalSystem != "" ? evalSystem : settings.thisSystem.get();
+}
+
EvalSettings evalSettings;
static GlobalConfig::Register rEvalSettings(&evalSettings);
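`getCurrentSystem()` exists so that consumers never read `currentSystem` directly and the `eval-system` vs `system` fallback lives in one place. A sketch of the expected consumption, assuming the `builtins.currentSystem` constant is registered roughly the way other constants are in `eval.cc`:

```cpp
// Sketch only: how the constant would pick up the new defaulting logic.
Value v;
v.mkString(evalSettings.getCurrentSystem());  // "eval-system" if non-empty, else "system"
// ... the value would then be handed to the usual addConstant registration.
```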
diff --git a/src/libexpr/eval-settings.hh b/src/libexpr/eval-settings.hh
index db2971acb..60d3a6f25 100644
--- a/src/libexpr/eval-settings.hh
+++ b/src/libexpr/eval-settings.hh
@@ -21,12 +21,45 @@ struct EvalSettings : Config
Setting<Strings> nixPath{
this, getDefaultNixPath(), "nix-path",
R"(
- List of directories to be searched for `<...>` file references
+ List of search paths to use for [lookup path](@docroot@/language/constructs/lookup-path.md) resolution.
+ This setting determines the value of
+ [`builtins.nixPath`](@docroot@/language/builtin-constants.md#builtins-nixPath) and can be used with [`builtins.findFile`](@docroot@/language/builtin-constants.md#builtins-findFile).
- In particular, outside of [pure evaluation mode](#conf-pure-eval), this determines the value of
- [`builtins.nixPath`](@docroot@/language/builtin-constants.md#builtins-nixPath).
+ The default value is
+
+ ```
+ $HOME/.nix-defexpr/channels
+ nixpkgs=$NIX_STATE_DIR/profiles/per-user/root/channels/nixpkgs
+ $NIX_STATE_DIR/profiles/per-user/root/channels
+ ```
+
+ It can be overridden with the [`NIX_PATH` environment variable](@docroot@/command-ref/env-common.md#env-NIX_PATH) or the [`-I` command line option](@docroot@/command-ref/opt-common.md#opt-I).
+
+ > **Note**
+ >
+ > If [pure evaluation](#conf-pure-eval) is enabled, `nixPath` evaluates to the empty list `[ ]`.
+ )", {}, false};
+
+ Setting<std::string> currentSystem{
+ this, "", "eval-system",
+ R"(
+ This option defines
+ [`builtins.currentSystem`](@docroot@/language/builtin-constants.md#builtins-currentSystem)
+ in the Nix language if it is set as a non-empty string.
+ Otherwise, if it is defined as the empty string (the default), the value of the
+ [`system` ](#conf-system)
+ configuration setting is used instead.
+
+ Unlike `system`, this setting does not change what kind of derivations can be built locally.
+ This is useful for evaluating Nix code on one system to produce derivations to be built on another type of system.
)"};
+ /**
+ * Implements the `eval-system` vs `system` defaulting logic
+ * described for `eval-system`.
+ */
+ const std::string & getCurrentSystem();
+
Setting<bool> restrictEval{
this, false, "restrict-eval",
R"(
@@ -35,8 +68,6 @@ struct EvalSettings : Config
[`builtins.nixPath`](@docroot@/language/builtin-constants.md#builtins-nixPath),
or to URIs outside of
[`allowed-uris`](@docroot@/command-ref/conf-file.md#conf-allowed-uris).
-
- Also the default value for [`nix-path`](#conf-nix-path) is ignored, such that only explicitly set search path entries are taken into account.
)"};
Setting pureEval{this, false, "pure-eval",
@@ -45,9 +76,10 @@ struct EvalSettings : Config
- Restrict file system and network access to files specified by cryptographic hash
- Disable impure constants:
- - [`bultins.currentSystem`](@docroot@/language/builtin-constants.md#builtins-currentSystem)
+ - [`builtins.currentSystem`](@docroot@/language/builtin-constants.md#builtins-currentSystem)
- [`builtins.currentTime`](@docroot@/language/builtin-constants.md#builtins-currentTime)
- [`builtins.nixPath`](@docroot@/language/builtin-constants.md#builtins-nixPath)
+ - [`builtins.storePath`](@docroot@/language/builtin-constants.md#builtins-storePath)
)"
};
@@ -68,6 +100,11 @@ struct EvalSettings : Config
evaluation mode. For example, when set to
`https://github.com/NixOS`, builtin functions such as `fetchGit` are
allowed to access `https://github.com/NixOS/patchelf.git`.
+
+ Access is granted when
+ - the URI is equal to the prefix,
+ - or the URI is a subpath of the prefix,
+ - or the prefix is a URI scheme ended by a colon `:` and the URI has the same scheme.
)"};
Setting<bool> traceFunctionCalls{this, false, "trace-function-calls",
@@ -99,6 +136,19 @@ Setting<bool> traceVerbose{this, false, "trace-verbose",
Setting<bool> traceVerbose{this, false, "trace-verbose",
"Whether `builtins.traceVerbose` should trace its first argument when evaluated."};
+
+ Setting<size_t> maxCallDepth{this, 10000, "max-call-depth",
+ "The maximum function call depth to allow before erroring."};
+
+ Setting<bool> builtinsTraceDebugger{this, false, "debugger-on-trace",
+ R"(
+ If set to true and the `--debugger` flag is given,
+ [`builtins.trace`](@docroot@/language/builtins.md#builtins-trace) will
+ enter the debugger like
+ [`builtins.break`](@docroot@/language/builtins.md#builtins-break).
+
+ This is useful for debugging warnings in third-party Nix code.
+ )"};
};
extern EvalSettings evalSettings;
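The enforcement of `max-call-depth` is not part of this excerpt; conceptually it only requires a recursion counter around function application that consults `evalSettings.maxCallDepth`. A rough, assumption-laden sketch of that idea (names and placement are illustrative, not the actual implementation):

```cpp
// Conceptual sketch of a call-depth guard; the real check would live in
// EvalState::callFunction and may differ in detail.
struct CallDepthGuard
{
    size_t & depth;
    CallDepthGuard(EvalState & state, size_t & depth, const PosIdx pos)
        : depth(depth)
    {
        if (++depth > evalSettings.maxCallDepth)
            state.error<EvalError>("stack overflow; max-call-depth exceeded")
                .atPos(pos)
                .debugThrow();
    }
    ~CallDepthGuard() { --depth; }
};
```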
diff --git a/src/libexpr/eval.cc b/src/libexpr/eval.cc
index 7e68e6f9b..5e2f71649 100644
--- a/src/libexpr/eval.cc
+++ b/src/libexpr/eval.cc
@@ -2,6 +2,8 @@
#include "eval-settings.hh"
#include "hash.hh"
#include "primops.hh"
+#include "print-options.hh"
+#include "shared.hh"
#include "types.hh"
#include "util.hh"
#include "store-api.hh"
@@ -14,9 +16,15 @@
#include "profiles.hh"
#include "print.hh"
#include "fs-input-accessor.hh"
+#include "filtering-input-accessor.hh"
#include "memory-input-accessor.hh"
#include "signals.hh"
#include "gc-small-vector.hh"
+#include "url.hh"
+#include "fetch-to-store.hh"
+#include "tarball.hh"
+#include "flake/flakeref.hh"
+#include "parser-tab.hh"
#include
#include
@@ -26,9 +34,9 @@
#include
#include
#include
-#include
#include
#include
+#include
#include
#include
@@ -102,116 +110,23 @@ RootValue allocRootValue(Value * v)
#endif
}
-void Value::print(const SymbolTable &symbols, std::ostream &str,
- std::set