poetry install | SolverProblemError: Because my_project depends on string (*) which doesn't match any versions, version solving failed

I have yet to use poetry to run a project, so excuse my lack of understanding.
I successfully installed the Poetry Python dependency manager using:
curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python3
The next step, poetry install, initially returned this error:
me@LAPTOP-G1DAPU88:~/.ssh/workers-python/workers$ poetry install
RuntimeError
Poetry could not find a pyproject.toml file in /home/me/.ssh/workers-python/workers or its parents
at ~/.poetry/lib/poetry/_vendor/py3.8/poetry/core/factory.py:369 in locate
      365│             if poetry_file.exists():
      366│                 return poetry_file
      367│
      368│         else:
    → 369│             raise RuntimeError(
      370│                 "Poetry could not find a pyproject.toml file in {} or its parents".format(
      371│                     cwd
      372│                 )
      373│             )
I soon realised I needed to create my own pyproject.toml file. Running poetry install again yielded:
$ poetry install
TOMLError
Invalid TOML file /home/me/.ssh/workers-python/workers/pyproject.toml: Key "json " already exists.
at ~/.poetry/lib/poetry/_vendor/py3.8/poetry/core/toml/file.py:34 in read
      30│     def read(self):  # type: () -> "TOMLDocument"
      31│         try:
      32│             return super(TOMLFile, self).read()
      33│         except (ValueError, TOMLKitError) as e:
    → 34│             raise TOMLError("Invalid TOML file {}: {}".format(self.path.as_posix(), e))
      35│
      36│     def __getattr__(self, item):  # type: (str) -> Any
      37│         return getattr(self.__path, item)
      38│
The above error indicates there were duplicate entries.
Running poetry install again, with the now updated pyproject.toml file in the cwd, threw the error in the post's title:
$ poetry install
Creating virtualenv my_project-1_EUeV5I-py3.8 in /home/me/.cache/pypoetry/virtualenvs
Updating dependencies
Resolving dependencies... (28.4s)
SolverProblemError
Because my_project depends on string (*) which doesn't match any versions, version solving failed.
at ~/.poetry/lib/poetry/puzzle/solver.py:241 in _solve
      237│             packages = result.packages
      238│         except OverrideNeeded as e:
      239│             return self.solve_in_compatibility_mode(e.overrides, use_latest=use_latest)
      240│         except SolveFailure as e:
    → 241│             raise SolverProblemError(e)
      242│
      243│         results = dict(
      244│             depth_first_search(
      245│                 PackageNode(self._package, packages), aggregate_package_nodes
However, temporarily removing all instances of = "*" gave me this error about a '\n' on line 12... which doesn't appear to be there:
$ poetry install
TOMLError
Invalid TOML file /home/me/.ssh/workers-python/workers/pyproject.toml: Unexpected character: '\n' at line 12 col 5
at ~/.poetry/lib/poetry/_vendor/py3.8/poetry/core/toml/file.py:34 in read
      30│     def read(self):  # type: () -> "TOMLDocument"
      31│         try:
      32│             return super(TOMLFile, self).read()
      33│         except (ValueError, TOMLKitError) as e:
    → 34│             raise TOMLError("Invalid TOML file {}: {}".format(self.path.as_posix(), e))
      35│
      36│     def __getattr__(self, item):  # type: (str) -> Any
      37│         return getattr(self.__path, item)
      38│
me@LAPTOP-G1DAPU88:~/.ssh/workers-python/workers$ cat pyproject.toml
[tool.poetry]
name = "my_project"
version = "0.1.0"
description = "Top-level package for my_project."
authors = [""]
packages = [
    { include = "my_project" },
]

[tool.poetry.dependencies]
python = "^3.8"
click # Suspect line
I have reverted this.
Current pyproject.toml:
[tool.poetry]
name = "data_simulator"
version = "0.1.0"
description = "Top-level package for data_simulator."
authors = ["iotahoe <iotahoe@iotahoe.com>"] # daniel.bell@hitachivantara.com / daniel@iotahoe.com
packages = [
    { include = "data_simulator" },
]

[tool.poetry.dependencies]
python = "^3.8"
click = "*"
#logging = "*"
#os = "*"
#pathlib = "*"
#time = "*"
numpy = "*"
pandas = "*"
#json = "*"
#random = "*"
faker = "*"
transformers = "4.4.2"
#re = "*"
#itertools = "*"
#datetime = "*"
#requests = "*"
#copy = "*"
#collections = "*"
#collections.abc = "*"
#multiprocessing = "*"
#multiprocessing.dummy = "*"
nltk = "*"
#nltk.corpus = "*"
#string = "*"

[tool.poetry.dev-dependencies]
isort = "5.6.4"
black = "^20.8b1"
invoke = "^1.4.1"
coveralls = "^2.2.0"
pytest = "^3.0"
flake8 = "^3.8.3"
mypy = "^0.782"

[[tool.poetry.source]]
name = "azure"
url = "https://pkgs.dev.azure.com/iotahoe/Halo/_packaging/private-sources/pypi/simple/"
secondary = true

[build-system]
requires = ["poetry>=0.12"]
build-backend = "poetry.masonry.api"
Note: 'name', 'authors', 'include', 'url' have been censored.

As general advice, I recommend using poetry's command line instead of creating/manipulating the pyproject.toml by hand.
Start with poetry init or poetry init -n and add your dependencies with poetry add.
The problem with your current pyproject.toml is that you declare built-in modules as dependencies, like os, pathlib, string and others. These ship with Python itself and are not installable from PyPI. This is why you receive the message Because my_project depends on string (*) which doesn't match any versions, version solving failed., which means poetry cannot find any matching package information in the repository.
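If you are unsure which names are built-in modules, a quick check helps; below is a minimal sketch (it uses sys.stdlib_module_names, which requires Python >= 3.10, and takes its names from the pyproject.toml above):

import sys

# Anything flagged as standard library must be removed from
# [tool.poetry.dependencies]; only real PyPI packages belong there.
for name in ("os", "pathlib", "string", "json", "numpy", "click"):
    if name in sys.stdlib_module_names:
        print(name, "-> standard library, do not declare it")
    else:
        print(name, "-> a PyPI package, fine to declare")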

tl;dr: Flush the *.egg-info directories before running poetry lock.
This answer is not strictly related to the current issue, but a similar error message can appear in other circumstances, so I think it's valuable to share it here.
If you are locking in a project where sub-dependencies are directly available on the file system, some *.egg-info directories may interfere with the locking process, causing issues when trying to run poetry install in a context where those *.egg-info files are missing. To avoid the problem: Flush the *.egg-info directories prior to locking. You should then have an updated poetry.lock file with more content.
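For example, here is a minimal cleanup sketch (assuming it is run from the project root; it deletes every *.egg-info directory below the current directory so that poetry lock starts fresh):

import shutil
from pathlib import Path

# Flush stale *.egg-info metadata before running "poetry lock".
for egg_info in Path(".").rglob("*.egg-info"):
    if egg_info.is_dir():
        print("removing", egg_info)
        shutil.rmtree(egg_info)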

Related

MyST-Parser: Auto linking / linkifying references to bug tracker issues

I use Sphinx with MyST-Parser for markdown, and I want GitHub- or GitLab-style auto linking (linkifying) for references.
Is there a way to have MyST render the reference:
#346
In docutils-speak, this is a Text node (example)
And behave as if it was:
[#346](https://github.com/vcs-python/libvcs/pull/346)
So when rendered it'd be like:
#346
Not the custom role:
{issue}`1` <- Not this
Another example: linkifying the reference @user to a GitHub, GitLab, or StackOverflow user.
What I'm currently doing (and why it doesn't work)
Right now I'm using the canonical solution docutils offers: custom roles.
I use sphinx-issues (PyPI), which does just that. It uses a Sphinx setting variable, issues_github_path, to parse the URL:
e.g. in Sphinx configuration conf.py:
issues_github_path = 'vcs-python/libvcs'
reStructuredText:
:issue:`346`
MyST-Parser:
{issue}`346`
Why custom roles don't work
Sadly, those aren't bi-directional with GitHub/GitLab/tools. If you copy/paste MyST-Parser -> GitHub/GitLab or preview it directly, it looks very bad:
Example of CHANGES:
Example issue: https://github.com/vcs-python/libvcs/issues/363
What we want is to just be able to copy markdown including #347 to and from.
Does a solution already exist?
Are there any docutils or sphinx plugin projects out there that turn @username or #issues into links?
sphinx (at least) can demonstrably do so for custom roles - as seen in sphinx-issues' usage of issues_github_path - by using project configuration context.
MyST-Parser has a linkify extension which uses linkify-it-py
This can turn a bare URL like https://www.google.com into a clickable link, without needing to write <https://www.google.com>.
Therefore, there may already be a tool out there.
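For reference, enabling it is a small conf.py change (a sketch, assuming linkify-it-py is installed; note that it only autolinks bare URLs, not #346-style references):

# conf.py
extensions = ["myst_parser"]
myst_enable_extensions = ["linkify"]  # requires: pip install linkify-it-py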
Can it be done through the API?
The toolchain for myst, sphinx and docutils is robust. This is a special case.
This needs to be done at the Text node level. Custom role won't work - as stated above - since it'll create markdown that can't be copied between GitLab and GitHub issues trivially.
The stack:
MyST-Parser API (Markdown-it-py API) > Sphinx APIs (MySTParser + Sphinx) > Docutils API
At the time of writing, I'm using Sphinx 4.3.2, MyST-Parser 0.17.2, and docutils 0.17.1 on python 3.10.2.
Notes
For the sake of an example, I'm using an open source project of mine that is facing this issue.
This is only about autolinking issues or usernames - things that'd easily be mappable to URLs. autodoc code-linking is out of scope.
There is a (defunct) project that does this: sphinxcontrib-issuetracker.
I've rebooted it:
conf.py:
import sys
from pathlib import Path

cwd = Path(__file__).parent
project_root = cwd.parent

sys.path.insert(0, str(project_root))
sys.path.insert(0, str(cwd / "_ext"))

extensions = [
    "link_issues",
]

# issuetracker
issuetracker = "github"
issuetracker_project = "cihai/unihan-etl"  # e.g. for https://github.com/cihai/unihan-etl
_ext/link_issues.py:
"""Issue linking w/ plain-text autolinking, e.g. #42
Credit: https://github.com/ignatenkobrain/sphinxcontrib-issuetracker
License: BSD
Changes by Tony Narlock (2022-08-21):
- Type annotations
mypy --strict, requires types-requests, types-docutils
Python < 3.10 require typing-extensions
- TrackerConfig: Use dataclasses instead of typing.NamedTuple and hacking __new__
- app.warn (removed in 5.0) -> Use Sphinx Logging API
https://www.sphinx-doc.org/en/master/extdev/logging.html#logging-api
- Add PendingIssueXRef
Typing for tracker_config and precision
- Add IssueTrackerBuildEnvironment
Subclassed / typed BuildEnvironment with .tracker_config
- Just GitHub (for demonstration)
"""
import dataclasses
import re
import sys
import time
import typing as t
import requests
from docutils import nodes
from sphinx.addnodes import pending_xref
from sphinx.application import Sphinx
from sphinx.config import Config
from sphinx.environment import BuildEnvironment
from sphinx.transforms import SphinxTransform
from sphinx.util import logging
if t.TYPE_CHECKING:
if sys.version_info >= (3, 10):
from typing import TypeGuard
else:
from typing_extensions import TypeGuard
logger = logging.getLogger(__name__)
GITHUB_API_URL = "https://api.github.com/repos/{0.project}/issues/{1}"
class IssueTrackerBuildEnvironment(BuildEnvironment):
tracker_config: "TrackerConfig"
issuetracker_cache: "IssueTrackerCache"
github_rate_limit: t.Tuple[float, bool]
class Issue(t.NamedTuple):
id: str
title: str
url: str
closed: bool
IssueTrackerCache = t.Dict[str, Issue]
#dataclasses.dataclass
class TrackerConfig:
project: str
url: str
"""
Issue tracker configuration.
This class provides configuration for trackers, and is passed as
``tracker_config`` arguments to callbacks of
:event:`issuetracker-lookup-issue`.
"""
def __post_init__(self) -> None:
if self.url is not None:
self.url = self.url.rstrip("/")
#classmethod
def from_sphinx_config(cls, config: Config) -> "TrackerConfig":
"""
Get tracker configuration from ``config``.
"""
project = config.issuetracker_project or config.project
url = config.issuetracker_url
return cls(project=project, url=url)
class PendingIssueXRef(pending_xref):
tracker_config: TrackerConfig
class IssueReferences(SphinxTransform):
default_priority = 999
def apply(self) -> None:
config = self.document.settings.env.config
tracker_config = TrackerConfig.from_sphinx_config(config)
issue_pattern = config.issuetracker_issue_pattern
title_template = None
if isinstance(issue_pattern, str):
issue_pattern = re.compile(issue_pattern)
for node in self.document.traverse(nodes.Text):
parent = node.parent
if isinstance(parent, (nodes.literal, nodes.FixedTextElement)):
# ignore inline and block literal text
continue
if isinstance(parent, nodes.reference):
continue
text = str(node)
new_nodes = []
last_issue_ref_end = 0
for match in issue_pattern.finditer(text):
# catch invalid pattern with too many groups
if len(match.groups()) != 1:
raise ValueError(
"issuetracker_issue_pattern must have "
"exactly one group: {0!r}".format(match.groups())
)
# extract the text between the last issue reference and the
# current issue reference and put it into a new text node
head = text[last_issue_ref_end : match.start()]
if head:
new_nodes.append(nodes.Text(head))
# adjust the position of the last issue reference in the
# text
last_issue_ref_end = match.end()
# extract the issue text (including the leading dash)
issuetext = match.group(0)
# extract the issue number (excluding the leading dash)
issue_id = match.group(1)
# turn the issue reference into a reference node
refnode = PendingIssueXRef()
refnode["refdomain"] = None
refnode["reftarget"] = issue_id
refnode["reftype"] = "issue"
refnode["trackerconfig"] = tracker_config
reftitle = title_template or issuetext
refnode.append(
nodes.inline(issuetext, reftitle, classes=["xref", "issue"])
)
new_nodes.append(refnode)
if not new_nodes:
# no issue references were found, move on to the next node
continue
# extract the remaining text after the last issue reference, and
# put it into a text node
tail = text[last_issue_ref_end:]
if tail:
new_nodes.append(nodes.Text(tail))
# find and remove the original node, and insert all new nodes
# instead
parent.replace(node, new_nodes)
def is_issuetracker_env(
env: t.Any,
) -> "TypeGuard['IssueTrackerBuildEnvironment']":
return hasattr(env, "issuetracker_cache") and env.issuetracker_cache is not None
def lookup_issue(
app: Sphinx, tracker_config: TrackerConfig, issue_id: str
) -> t.Optional[Issue]:
"""
Lookup the given issue.
The issue is first looked up in an internal cache. If it is not found, the
event ``issuetracker-lookup-issue`` is emitted. The result of this
invocation is then cached and returned.
``app`` is the sphinx application object. ``tracker_config`` is the
:class:`TrackerConfig` object representing the issue tracker configuration.
``issue_id`` is a string containing the issue id.
Return a :class:`Issue` object for the issue with the given ``issue_id``,
or ``None`` if the issue wasn't found.
"""
env = app.env
if is_issuetracker_env(env):
cache: IssueTrackerCache = env.issuetracker_cache
if issue_id not in cache:
issue = app.emit_firstresult(
"issuetracker-lookup-issue", tracker_config, issue_id
)
cache[issue_id] = issue
return cache[issue_id]
return None
def lookup_issues(app: Sphinx, doctree: nodes.document) -> None:
"""
Lookup issues found in the given ``doctree``.
Each issue reference in the given ``doctree`` is looked up. Each lookup
result is cached by mapping the referenced issue id to the looked up
:class:`Issue` object (an existing issue) or ``None`` (a missing issue).
The cache is available at ``app.env.issuetracker_cache`` and is pickled
along with the environment.
"""
for node in doctree.traverse(PendingIssueXRef):
if node["reftype"] == "issue":
lookup_issue(app, node["trackerconfig"], node["reftarget"])
def make_issue_reference(issue: Issue, content_node: nodes.inline) -> nodes.reference:
"""
Create a reference node for the given issue.
``content_node`` is a docutils node which is supposed to be added as
content of the created reference. ``issue`` is the :class:`Issue` which
the reference shall point to.
Return a :class:`docutils.nodes.reference` for the issue.
"""
reference = nodes.reference()
reference["refuri"] = issue.url
if issue.title:
reference["reftitle"] = issue.title
if issue.closed:
content_node["classes"].append("closed")
reference.append(content_node)
return reference
def resolve_issue_reference(
app: Sphinx, env: BuildEnvironment, node: PendingIssueXRef, contnode: nodes.inline
) -> t.Optional[nodes.reference]:
"""
Resolve an issue reference and turn it into a real reference to the
corresponding issue.
``app`` and ``env`` are the Sphinx application and environment
respectively. ``node`` is a ``pending_xref`` node representing the missing
reference. It is expected to have the following attributes:
- ``reftype``: The reference type
- ``trackerconfig``: The :class:`TrackerConfig`` to use for this node
- ``reftarget``: The issue id
- ``classes``: The node classes
References with a ``reftype`` other than ``'issue'`` are skipped by
returning ``None``. Otherwise the new node is returned.
If the referenced issue was found, a real reference to this issue is
returned. The text of this reference is formatted with the :class:`Issue`
object available in the ``issue`` key. The reference title is set to the
issue title. If the issue is closed, the class ``closed`` is added to the
new content node.
Otherwise, if the issue was not found, the content node is returned.
"""
if node["reftype"] != "issue":
return None
issue = lookup_issue(app, node["trackerconfig"], node["reftarget"])
if issue is None:
return contnode
else:
classes = contnode["classes"]
conttext = str(contnode[0])
formatted_conttext = nodes.Text(conttext.format(issue=issue))
formatted_contnode = nodes.inline(conttext, formatted_conttext, classes=classes)
assert issue is not None
return make_issue_reference(issue, formatted_contnode)
return None
def init_cache(app: Sphinx) -> None:
if not hasattr(app.env, "issuetracker_cache"):
app.env.issuetracker_cache: "IssueTrackerCache" = {} # type: ignore
return None
def check_project_with_username(tracker_config: TrackerConfig) -> None:
if "/" not in tracker_config.project:
raise ValueError(
"username missing in project name: {0.project}".format(tracker_config)
)
HEADERS = {"User-Agent": "sphinxcontrib-issuetracker v{0}".format("1.0")}
def get(app: Sphinx, url: str) -> t.Optional[requests.Response]:
"""
Get a response from the given ``url``.
``url`` is a string containing the URL to request via GET. ``app`` is the
Sphinx application object.
Return the :class:`~requests.Response` object on status code 200, or
``None`` otherwise. If the status code is not 200 or 404, a warning is
emitted via ``app``.
"""
response = requests.get(url, headers=HEADERS)
if response.status_code == requests.codes.ok:
return response
elif response.status_code != requests.codes.not_found:
msg = "GET {0.url} failed with code {0.status_code}"
logger.warning(msg.format(response))
return None
def lookup_github_issue(
app: Sphinx, tracker_config: TrackerConfig, issue_id: str
) -> t.Optional[Issue]:
check_project_with_username(tracker_config)
env = app.env
if is_issuetracker_env(env):
# Get rate limit information from the environment
timestamp, limit_hit = getattr(env, "github_rate_limit", (0, False))
if limit_hit and time.time() - timestamp > 3600:
# Github limits applications hourly
limit_hit = False
if not limit_hit:
url = GITHUB_API_URL.format(tracker_config, issue_id)
response = get(app, url)
if response:
rate_remaining = response.headers.get("X-RateLimit-Remaining")
assert rate_remaining is not None
if rate_remaining.isdigit() and int(rate_remaining) == 0:
logger.warning("Github rate limit hit")
env.github_rate_limit = (time.time(), True)
issue = response.json()
closed = issue["state"] == "closed"
return Issue(
id=issue_id,
title=issue["title"],
closed=closed,
url=issue["html_url"],
)
else:
logger.warning(
"Github rate limit exceeded, not resolving issue {0}".format(issue_id)
)
return None
BUILTIN_ISSUE_TRACKERS: t.Dict[str, t.Any] = {
"github": lookup_github_issue,
}
def init_transformer(app: Sphinx) -> None:
if app.config.issuetracker_plaintext_issues:
app.add_transform(IssueReferences)
def connect_builtin_tracker(app: Sphinx) -> None:
if app.config.issuetracker:
tracker = BUILTIN_ISSUE_TRACKERS[app.config.issuetracker.lower()]
app.connect(str("issuetracker-lookup-issue"), tracker)
def setup(app: Sphinx) -> t.Dict[str, t.Any]:
app.add_config_value("mybase", "https://github.com/cihai/unihan-etl", "env")
app.add_event(str("issuetracker-lookup-issue"))
app.connect(str("builder-inited"), connect_builtin_tracker)
app.add_config_value("issuetracker", None, "env")
app.add_config_value("issuetracker_project", None, "env")
app.add_config_value("issuetracker_url", None, "env")
# configuration specific to plaintext issue references
app.add_config_value("issuetracker_plaintext_issues", True, "env")
app.add_config_value(
"issuetracker_issue_pattern",
re.compile(
r"#(\d+)",
),
"env",
)
app.add_config_value("issuetracker_title_template", None, "env")
app.connect(str("builder-inited"), init_cache)
app.connect(str("builder-inited"), init_transformer)
app.connect(str("doctree-read"), lookup_issues)
app.connect(str("missing-reference"), resolve_issue_reference)
return {
"version": "1.0",
"parallel_read_safe": True,
"parallel_write_safe": True,
}
Mirrors
https://gist.github.com/tony/05a3043d97d37c158763fb2f6a2d5392
https://github.com/ignatenkobrain/sphinxcontrib-issuetracker/issues/25
Mypy users
mypy --strict docs/_ext/link_issues.py works as of mypy 0.971.
If you use mypy: pip install types-docutils types-requests
Install:
https://pypi.org/project/types-docutils/
https://pypi.org/project/types-requests/
https://pypi.org/project/typing-extensions/ (Python <3.10)
Example
via unihan-etl#261 / v0.17.2 (source, view, but page may be outdated)

In a setup with two Nix Flakes, where one provides a plugin for the other's application, "path <…> is not valid". How to fix that?

I have two Nix Flakes: One contains an application, and the other contains a plugin for that application. When I build the application with the plugin, I get the error
error: path '/nix/store/3b7djb5pr87zbscggsr7vnkriw3yp21x-mainapp-go-modules' is not valid
I have no idea what this error means and how to fix it, but I can reproduce it on both macOS and Linux. The path in question is the vendor directory generated by the first step of buildGoModule.
The minimal setup to reproduce the error requires a bunch of files, so I provide a commented bash script that you can execute in an empty folder to recreate my setup:
#!/bin/bash
# I have two flakes: the main application and a plugin.
# the mainapp needs to be inside the plugin directory
# so that nix doesn't complain about the path:mainapp
# reference being outside the parent's root.
mkdir -p plugin/mainapp
# each is a go module with minimal setup
tee plugin/mainapp/go.mod <<EOF >/dev/null
module example.com/mainapp
go 1.16
EOF
tee plugin/go.mod <<EOF >/dev/null
module example.com/plugin
go 1.16
EOF
# each contains minimal Go code
tee plugin/mainapp/main.go <<EOF >/dev/null
package main

import "fmt"

func main() {
    fmt.Println("Hello, World!")
}
EOF
tee plugin/main.go <<EOF >/dev/null
package plugin

import "fmt"

func init() {
    fmt.Println("initializing plugin")
}
EOF
# the mainapp is a flake that provides a function for building
# the app, as well as a default package that is the app
# without any plugins.
tee plugin/mainapp/flake.nix <<'EOF' >/dev/null
{
  description = "main application";

  inputs = {
    nixpkgs.url = github:NixOS/nixpkgs/nixos-21.11;
    flake-utils.url = github:numtide/flake-utils;
  };

  outputs = {self, nixpkgs, flake-utils}:
    let
      # buildApp builds the application from a list of plugins.
      # plugins cause the vendorSha256 to change, hence it is
      # given as additional parameter.
      buildApp = { pkgs, vendorSha256, plugins ? [] }:
        let
          # this is appended to the mainapp's go.mod so that it
          # knows about the plugin and where to find it.
          requirePlugin = plugin: ''
            require ${plugin.goPlugin.goModName} v0.0.0
            replace ${plugin.goPlugin.goModName} => ${plugin.outPath}/src
          '';
          # since buildGoModule consumes the source two times –
          # first for vendoring, and then for building –
          # we do the necessary modifications to the sources in an
          # own derivation and then hand that to buildGoModule.
          sources = pkgs.stdenvNoCC.mkDerivation {
            name = "mainapp-with-plugins-source";
            src = self;
            phases = [ "unpackPhase" "buildPhase" "installPhase" ];
            # write a plugins.go file that references the plugin's package via
            # _ = "<module path>"
            PLUGINS_GO = ''
              package main
              // Code generated by Nix. DO NOT EDIT.
              import (
              ${builtins.foldl' (a: b: a + "\n\t_ = \"${b.goPlugin.goModName}\"") "" plugins}
              )
            '';
            GO_MOD_APPEND = builtins.foldl' (a: b: a + "${requirePlugin b}\n") "" plugins;
            buildPhase = ''
              printenv PLUGINS_GO >plugins.go
              printenv GO_MOD_APPEND >>go.mod
            '';
            installPhase = ''
              mkdir -p $out
              cp -r -t $out *
            '';
          };
        in pkgs.buildGoModule {
          name = "mainapp";
          src = builtins.trace "sources at ${sources}" sources;
          inherit vendorSha256;
          nativeBuildInputs = plugins;
        };
    in (flake-utils.lib.eachDefaultSystem (system:
      let
        pkgs = nixpkgs.legacyPackages.${system};
      in rec {
        defaultPackage = buildApp {
          inherit pkgs;
          # this may be different depending on your nixpkgs; if it is, just change it.
          vendorSha256 = "sha256-pQpattmS9VmO3ZIQUFn66az8GSmB4IvYhTTCFn6SUmo=";
        };
      }
    )) // {
      lib = {
        inherit buildApp;
        # helper that parses a go.mod file for the module's name
        pluginMetadata = goModFile: {
          goModName = with builtins; head
            (match "module ([^[:space:]]+).*" (readFile goModFile));
        };
      };
    };
}
EOF
# the plugin is a flake depending on the mainapp that outputs a plugin package,
# and also a package that is the mainapp compiled with this plugin.
tee plugin/flake.nix <<'EOF' >/dev/null
{
  description = "mainapp plugin";

  inputs = {
    nixpkgs.url = github:NixOS/nixpkgs/nixos-21.11;
    flake-utils.url = github:numtide/flake-utils;
    nix-filter.url = github:numtide/nix-filter;
    mainapp.url = path:mainapp;
    mainapp.inputs = {
      nixpkgs.follows = "nixpkgs";
      flake-utils.follows = "flake-utils";
    };
  };

  outputs = {self, nixpkgs, flake-utils, nix-filter, mainapp}:
    flake-utils.lib.eachDefaultSystem (system:
      let
        pkgs = nixpkgs.legacyPackages.${system};
      in rec {
        packages = rec {
          plugin = pkgs.stdenvNoCC.mkDerivation {
            pname = "mainapp-plugin";
            version = "0.1.0";
            src = nix-filter.lib.filter {
              root = ./.;
              exclude = [ ./mainapp ./flake.nix ./flake.lock ];
            };
            # needed for mainapp to recognize this as plugin
            passthru.goPlugin = mainapp.lib.pluginMetadata ./go.mod;
            phases = [ "unpackPhase" "installPhase" ];
            installPhase = ''
              mkdir -p $out/src
              cp -r -t $out/src *
            '';
          };
          app = mainapp.lib.buildApp {
            inherit pkgs;
            # this may be different depending on your nixpkgs; if it is, just change it.
            vendorSha256 = "sha256-a6HFGFs1Bu9EkXwI+DxH5QY2KBcdPzgP7WX6byai4hw=";
            plugins = [ plugin ];
          };
        };
        defaultPackage = packages.app;
      }
    );
}
EOF
You need Nix with Flake support installed to reproduce the error.
In the plugin folder created by this script, execute
$ nix build
trace: sources at /nix/store/d5arinbiaspyjjc4ypk4h5dsjx22pcsf-mainapp-with-plugins-source
error: path '/nix/store/3b7djb5pr87zbscggsr7vnkriw3yp21x-mainapp-go-modules' is not valid
(If you get hash mismatches, just update the flakes with the correct hash; I am not quite sure whether hashing when spreading flakes outside of a repository is reproducible.)
The sources directory (shown by trace) does exist and looks okay. The path given in the error message also exists and contains modules.txt with expected content.
In the folder mainapp, nix build does run successfully, which builds the app without plugins. So what is it that I do with the plugin that makes the path invalid?
The reason is that the modules.txt file generated as part of vendoring will, in this scenario, contain the nix store path of the plugin in its replace directive. The vendor directory is a fixed-output derivation and thus must not depend on any other derivations; the reference in modules.txt violates this.
This can only be fixed by copying the plugin's sources into the sources derivation – that way, the replace path can be relative and thus references no other nix store path.
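To see the problem yourself, here is a small diagnostic sketch in Python (the vendor path is hypothetical) that scans the generated modules.txt for absolute store paths:

from pathlib import Path

# Any replace line pointing into /nix/store/ makes the fixed-output vendor
# derivation reference another derivation, which is what Nix rejects here.
for line in Path("vendor/modules.txt").read_text().splitlines():
    if "=>" in line and "/nix/store/" in line:
        print("offending replace directive:", line.strip())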

GPG not working with python 3.8 but ok with 3.6

I have just started dabbling with Python and I'm stuck on my first project.
I need help trying to make some sense out of gpg. I have been struggling to get gpg to work with Python 3.8.1. If I run the code in Thonny with Python 3.6.9, it runs just fine.
The version is gpg (GnuPG) 2.2.4, libgcrypt 1.8.1.
Home directory: /home/bob/.gnupg
gnupg: /usr/local/lib/python3.8/site-packages/gnupg
Using Python 3.6.9 it works just fine:
#!/usr/bin/python3
from pathlib import Path
import gnupg

# My gpg keys home directory.
#gpg = gnupg.GPG(homedir='/home/bob/.gnupg')
gpg = gnupg.GPG(gnupghome='/home/bob/.gnupg')

local_path = Path("/home/bob")
src_dir = ("/home/bob/Tbox/Channels2.csv")

with open(src_dir, 'rb') as afile:
    # text = afile.read()
    status = gpg.encrypt_file(afile,
                              ['bobh@gunas.co.uk'],
                              output='/home/bob/Tbox/Channels2.csv.gpg')

print('ok: ', status.ok)
print('status: ', status.status)
print('stderr: ', status.stderr)
SHELL OUTPUT
ok: True
status: encryption ok
stderr: [GNUPG:] KEY_CONSIDERED 4678A2C439E752DA3DAE2EBA7357BB95381CD73 0
[GNUPG:] KEY_CONSIDERED 4678A2C439E752DA3DAE2EBA7357BB95381CD73 0
[GNUPG:] ENCRYPTION_COMPLIANCE_MODE 23
[GNUPG:] BEGIN_ENCRYPTION 2 9
[GNUPG:] END_ENCRYPTION
However, if I run the code in Thonny with Python 3.8.1, it does not work, with this error message in the shell:
#!/usr/bin/python3
from pathlib import Path
import gnupg

# My gpg keys home directory.
gpg = gnupg.GPG(homedir='/home/bob/.gnupg')
#gpg = gnupg.GPG(gnupghome='/home/bob/.gnupg')

local_path = Path("/home/bob")
backup_dir = Path("/home/bob/Tbox/tbackup-test")
src_dir = ("/home/bob/Tbox/Channels2.csv")

with open(src_dir, 'rb') as afile:
    text = afile.read()
    # status = gpg.encrypt_file(text,
    status = gpg.encrypt(afile,
                         ['bobh@gunas.co.uk'],
                         output='/home/bob/Tbox/Channels2.csv.gpg')

print('ok: ', status.ok)
print('status: ', status.status)
print('stderr: ', status.stderr)
SHELL OUTPUT
ok: False
status: None
stderr: gpg: Sorry, no terminal at all requested - can't get input
I have tried adding the line no-tty to the gpg.conf file, but this did not help.
I have tried some examples off the net, but with no joy. One problem I have found is to do with gpg and the word Context: c = gpg.core.Context(armor=True) raises AttributeError: 'GPG' object has no attribute 'core'.
In the second example, instead of:
status = gpg.encrypt(afile,
you probably need:
status = gpg.encrypt(text,
Basically, you need to decide whether you are encrypting a file or the contents of a file (which you're reading into the variable text), and then use gpg.encrypt_file or gpg.encrypt accordingly.
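A corrected sketch of the second script (assuming the python-gnupg package, whose constructor takes gnupghome rather than homedir):

#!/usr/bin/python3
import gnupg

gpg = gnupg.GPG(gnupghome='/home/bob/.gnupg')

with open('/home/bob/Tbox/Channels2.csv', 'rb') as afile:
    # encrypt_file expects the file object; gpg.encrypt(text, ...) would
    # expect the bytes read from it instead.
    status = gpg.encrypt_file(afile,
                              ['bobh@gunas.co.uk'],
                              output='/home/bob/Tbox/Channels2.csv.gpg')

print('ok: ', status.ok)
print('status: ', status.status)
print('stderr: ', status.stderr)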

'No such file or directory' error when using buildGoPackage in nix

I'm trying to build the hasura cli: https://github.com/hasura/graphql-engine/tree/master/cli with the following code (deps derived from dep2nix):
{ buildGoPackage, fetchFromGitHub }:

buildGoPackage rec {
  version = "1.0.0-beta.2";
  name = "hasura-${version}";

  goPackagePath = "github.com/hasura/graphql-engine";
  subPackages = [ "cli" ];

  src = fetchFromGitHub {
    owner = "hasura";
    repo = "graphql-engine";
    rev = "v${version}";
    sha256 = "1b40s41idkp1nyb9ygxgsvrwv8rsll6dnwrifpn25bvnfk8idafr";
  };

  goDeps = ./deps.nix;
}
but I get the following error after the post-installation fixup step:
find: '/nix/store/gkck68cm2z9k1qxgmh350pq3kwsbyn8q-hasura-cli-1.0.0-beta.2': No such file or directory.
What am I doing wrong here? For reference, I'm on macOS and using home-manager.
For anyone still wondering:
There are a couple of things to consider:
dep has been deprecated in favor of go modules
This is also reflected in Nix, as buildGoPackage is now legacy and moved to buildGoModule
There is already a hasura-cli package in nixpkgs. You can just use it with nix-shell -p hasura-cli

cx_Freeze '@rpath/libQtDeclarative.4.dylib': doesn't exist or not a regular file

from cx_Freeze import setup, Executable
import sys

# Dependencies are automatically detected, but it might need
# fine tuning.
buildOptions = dict(packages=["idna", "lib", "gui", "plugins"], excludes=["Tcl", "tcl"])

base = 'Win32GUI' if sys.platform == 'win32' else None

executables = [
    Executable('electrum-xvg', base=base, targetName='Electrum XVG', icon="electrum.icns")
]

setup(name='electrum-xvg',
      version='1.0',
      description='',
      options=dict(build_exe=buildOptions),
      executables=executables)
I have the above setup.py file, which I am using to try to build an application on OS X Sierra. But when I run python setup.py bdist_mac it raises this error:
@rpath/libQtDeclarative.4.dylib
error: can't copy '@rpath/libQtDeclarative.4.dylib': doesn't exist or not a regular file
libQtDeclarative.4.dylib is present in ~/anaconda/envs/pyqtapp/lib on my system, but when I used otool -D libQtDeclarative.4.dylib it reported that no such file exists, so I ran
install_name_tool -id "@rpath/libQtDeclarative.4.dylib" libQtDeclarative.4.dylib
in ~/anaconda/envs/pyqtapp/lib. Now when I run otool -D libQtDeclarative.4.dylib I get:
libQtDeclarative.4.dylib:
@rpath/libQtDeclarative.4.dylib
but cx_Freeze still raises the error:
error: can't copy '@rpath/libQtDeclarative.4.dylib': doesn't exist or not a regular file
Try explicitly setting the include files (a list of relative paths):
includefiles = ['README.txt', 'CHANGELOG.txt', 'helpers\\uncompress\\unRAR.exe', 'helpers\\uncompress\\unzip.exe']
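For context, a minimal sketch of where such a list goes in a current cx_Freeze setup (the build_exe option is spelled include_files in recent versions; the filenames here are just examples):

buildOptions = dict(
    packages=["idna", "lib", "gui", "plugins"],
    excludes=["Tcl", "tcl"],
    include_files=['README.txt', 'CHANGELOG.txt'],  # copied next to the frozen app
)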
