Welcome to RequirementsLib’s documentation!¶
RequirementsLib: Requirement Management Library for Pip and Pipenv¶
🐉 Installation¶
Install from PyPI:
$ pipenv install requirementslib
Install from Github:
$ pipenv install -e git+https://github.com/sarugaku/requirementslib.git#egg=requirementslib
🐉 Summary¶
RequirementsLib provides a simple layer for building and interacting with requirements in both the Pipfile format and the requirements.txt format. This library was originally built for converting between these formats in Pipenv.
🐉 Usage¶
Importing a lockfile into your setup.py file¶
You can use RequirementsLib to import your lockfile into your setup.py to populate its install_requires dependencies:
from requirementslib import Lockfile
lockfile = Lockfile.create('/path/to/project/dir')
install_requires = lockfile.as_requirements(dev=False)
Interacting with a Pipfile directly¶
You can also interact directly with a Pipfile:
>>> from requirementslib import Pipfile
>>> pf = Pipfile.load('/home/hawk/git/pypa-pipenv')
>>> pf.sections
[Section(name='packages', requirements=[]), Section(name='dev-packages', requirements=[Requirement(name='pipenv', vcs=None, req=FileRequirement(setup_path=None, path='.', editable=True, uri='file:///home/hawk/git/pypa-pipenv', link=<Link file:///home/hawk/git/pypa-pipenv>, name='pipenv', req=<Requirement: "-e file:///home/hawk/git/pypa-pipenv">), markers='', specifiers=None, index=None, editable=True, hashes=[], extras=None),...]
And you can even write it back out into Pipfile’s native format:
>>> print(pf.dump(to_dict=False))
[packages]
[dev-packages]
pipenv = {path = ".", editable = true}
flake8 = ">=3.3.0,<4"
pytest = "*"
mock = "*"
[scripts]
tests = "bash ./run-tests.sh"
[pipenv]
allow_prereleases = true
Create a requirement object from requirements.txt format¶
>>> from requirementslib import Requirement
>>> r = Requirement.from_line('-e git+https://github.com/pypa/pipenv.git@master#egg=pipenv')
>>> print(r)
Requirement(name='pipenv', vcs='git', req=VCSRequirement(editable=True, uri='git+https://github.com/pypa/pipenv.git', path=None, vcs='git', ref='master', subdirectory=None, name='pipenv', link=<Link git+https://github.com/pypa/pipenv.git@master#egg=pipenv>, req=<Requirement: "-e git+https://github.com/pypa/pipenv.git@master#egg=pipenv">), markers=None, specifiers=None, index=None, editable=True, hashes=[], extras=[])
>>> r.as_pipfile()
{'pipenv': {'editable': True, 'ref': 'master', 'git': 'https://github.com/pypa/pipenv.git'}}
Or move from Pipfile format to requirements.txt:
>>> r = Requirement.from_pipfile(name='pythonfinder', indexes=[], pipfile={'path': '../pythonfinder', 'editable': True})
>>> r.as_line()
'-e ../pythonfinder'
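The round trip above can be illustrated with a minimal, self-contained sketch. Note this is an illustration of the mapping, not requirementslib's actual implementation, and it only covers the simple VCS, path, and named-requirement cases:

```python
# A minimal sketch (not requirementslib's actual implementation) of how a
# Pipfile-style entry maps onto a requirements.txt line, mirroring the
# conversions shown above.

def pipfile_entry_to_line(name, entry):
    """Convert a simplified Pipfile entry (dict or specifier string) to a pip-style line."""
    if isinstance(entry, dict) and "git" in entry:
        ref = "@" + entry["ref"] if entry.get("ref") else ""
        line = "git+{}{}#egg={}".format(entry["git"], ref, name)
    elif isinstance(entry, dict) and "path" in entry:
        line = entry["path"]
    else:
        # Named requirement: a plain specifier string (or "*" for any version)
        spec = entry if isinstance(entry, str) else entry.get("version", "*")
        return name + ("" if spec == "*" else spec)
    if entry.get("editable"):
        line = "-e " + line
    return line

print(pipfile_entry_to_line(
    "pipenv",
    {"editable": True, "ref": "master", "git": "https://github.com/pypa/pipenv.git"},
))
# -e git+https://github.com/pypa/pipenv.git@master#egg=pipenv
```

The real library handles far more cases (markers, hashes, extras, indexes); use `Requirement.from_pipfile(...).as_line()` for actual conversions.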
Resolving Editable Package Dependencies¶
RequirementsLib can also resolve the dependencies of editable packages by calling the run_requires method. This method returns a detailed dictionary containing metadata parsed from the package, which is built in a transient folder (unless it is already on the system or the call is run in a virtualenv). The output of run_requires is very detailed and in most cases will be sufficient:
>>> from pprint import pprint
>>> from requirementslib.models.requirements import Requirement
>>> r = Requirement.from_line("-e git+git@github.com:sarugaku/vistir.git#egg=vistir[spinner]")
>>> setup_info_dict = r.run_requires()
>>> pprint(setup_info_dict)
{'base_dir': '/tmp/requirementslib-t_ftl6no-src/src/vistir',
'build_backend': 'setuptools.build_meta',
'build_requires': ['setuptools>=36.2.2', 'wheel>=0.28.0'],
'extra_kwargs': {'build_dir': '/tmp/requirementslib-t_ftl6no-src/src',
'download_dir': '/home/hawk/.cache/pipenv/pkgs',
'src_dir': '/tmp/requirementslib-t_ftl6no-src/src',
'wheel_download_dir': '/home/hawk/.cache/pipenv/wheels'},
'extras': {'spinner': [Requirement.parse('cursor'),
Requirement.parse('yaspin')],
'tests': [Requirement.parse('pytest'),
Requirement.parse('pytest-xdist'),
Requirement.parse('pytest-cov'),
Requirement.parse('pytest-timeout'),
Requirement.parse('hypothesis-fspaths'),
Requirement.parse('hypothesis')]},
'ireq': <InstallRequirement object: vistir[spinner] from git+ssh://git@github.com/sarugaku/vistir.git#egg=vistir editable=True>,
'name': 'vistir',
'pyproject': PosixPath('/tmp/requirementslib-t_ftl6no-src/src/vistir/pyproject.toml'),
'python_requires': '>=2.6,!=3.0,!=3.1,!=3.2,!=3.3',
'requires': {'backports.functools_lru_cache;python_version<="3.4"': Requirement.parse('backports.functools_lru_cache; python_version <= "3.4"'),
'backports.shutil_get_terminal_size;python_version<"3.3"': Requirement.parse('backports.shutil_get_terminal_size; python_version < "3.3"'),
'backports.weakref;python_version<"3.3"': Requirement.parse('backports.weakref; python_version < "3.3"'),
'colorama': Requirement.parse('colorama'),
'pathlib2;python_version<"3.5"': Requirement.parse('pathlib2; python_version < "3.5"'),
'requests': Requirement.parse('requests'),
'six': Requirement.parse('six'),
'spinner': [Requirement.parse('cursor'),
Requirement.parse('yaspin')]},
'setup_cfg': PosixPath('/tmp/requirementslib-t_ftl6no-src/src/vistir/setup.cfg'),
'setup_py': PosixPath('/tmp/requirementslib-t_ftl6no-src/src/vistir/setup.py')}
As a side effect of calls to run_requires, new metadata is made available on the requirement itself via the property requirement.req.dependencies:
>>> pprint(r.req.dependencies)
({'backports.functools_lru_cache;python_version<="3.4"': Requirement.parse('backports.functools_lru_cache; python_version <= "3.4"'),
'backports.shutil_get_terminal_size;python_version<"3.3"': Requirement.parse('backports.shutil_get_terminal_size; python_version < "3.3"'),
'backports.weakref;python_version<"3.3"': Requirement.parse('backports.weakref; python_version < "3.3"'),
'colorama': Requirement.parse('colorama'),
'pathlib2;python_version<"3.5"': Requirement.parse('pathlib2; python_version < "3.5"'),
'requests': Requirement.parse('requests'),
'six': Requirement.parse('six'),
'spinner': [Requirement.parse('cursor'), Requirement.parse('yaspin')]},
[],
['setuptools>=36.2.2', 'wheel>=0.28.0'])
🐉 Contributing¶
- Fork the repository and clone the fork to your local machine:
git clone git@github.com:yourusername/requirementslib.git
- Move into the repository directory and update the submodules:
git submodule update --init --recursive
- Install the package locally in a virtualenv using pipenv:
pipenv install --dev
- You can also install the package into a virtualenv by running pip install -e .[dev,tests,typing] to ensure all the development and test dependencies are installed.
- Before making any changes to the code, make sure to file an issue. The best way to ensure a smooth collaboration is to communicate before investing significant time and energy into any changes! Make sure to consider not just your own use case but also those of others who might be using the library.
- Create a new branch. For bugs, you can simply branch to bugfix/<issuenumber>. Features can be branched to feature/<issuenumber>. This convention is to streamline the branching process and to encourage good practices around filing issues and associating pull requests with specific issues. If you find yourself addressing many issues in one pull request, that should give you pause.
- Make your desired changes. Don’t forget to add additional tests to account for your new code – continuous integration will fail without them.
- Test your changes by running pipenv run pytest -ra tests or simply pytest -ra tests if you are inside an activated virtual environment.
- Create a corresponding .rst file in the news directory with a one-sentence description of your change, e.g. Resolved an issue which sometimes prevented requirements from being converted from Pipfile entries to pip lines correctly.
- Commit your changes. The first line of your commit should be a summary of your changes, no longer than 72 characters, followed by a blank line, followed by a bulleted description of your changes. Don’t forget to add separate lines with the phrase Fixes #<issuenumber> for each issue you are addressing in your pull request.
- Before submitting your pull request, make sure to git remote add upstream git@github.com:sarugaku/requirementslib.git and then git fetch upstream && git pull upstream master to ensure your code is in sync with the latest version of the master branch.
- Create a pull request describing your fix, referencing the issues in question. If your commit message from the previous step was detailed, you should be able to copy and paste it.
requirementslib package¶
-
class
requirementslib.
Lockfile
(path: pathlib.Path = NOTHING, requirements: list = NOTHING, dev_requirements: list = NOTHING, projectfile: requirementslib.models.project.ProjectFile = NOTHING, lockfile: plette.lockfiles.Lockfile = NOTHING, newlines: str = '\n')[source]¶ Bases:
object
-
as_requirements
(include_hashes=False, dev=False)[source]¶ Returns a list of requirements in pip-style format
-
default
¶
-
dev_requirements
¶
-
dev_requirements_list
¶
-
develop
¶
-
extended_keys
¶
-
classmethod
from_data
(path, data, meta_from_project=True)[source]¶ Create a new lockfile instance from a dictionary.
Parameters:
-
get_requirements
(dev=True, only=False)[source]¶ Produces a generator which generates requirements from the desired section.
Parameters: dev (bool) – Indicates whether to use dev requirements, defaults to True Returns: Requirements from the relevant pipfile Return type: Requirement
-
classmethod
load
(path, create=True)[source]¶ Create a new lockfile instance.
Parameters: - project_path (str or
pathlib.Path
) – Path to project root or lockfile - lockfile_name (str) – Name of the lockfile in the project root directory
- pipfile_path (
pathlib.Path
) – Path to the project pipfile
Returns: A new lockfile representing the supplied project paths
Return type: - project_path (str or
-
classmethod
load_projectfile
(path, create=True, data=None)[source]¶ Given a path, load or create the necessary lockfile.
Parameters: Raises: - OSError – Thrown if the project root directory doesn’t exist
- FileNotFoundError – Thrown if the lockfile doesn’t exist and
create=False
Returns: A project file instance for the supplied project
Return type:
-
lockfile
¶
-
newlines
¶
-
path
¶
-
projectfile
¶
-
classmethod
read_projectfile
(path)[source]¶ Read the specified project file and provide an interface for writing/updating.
Parameters: path (str) – Path to the target file. Returns: A project file with the model and location for interaction Return type: ProjectFile
-
requirements
¶
-
requirements_list
¶
-
section_keys
¶
-
-
class
requirementslib.
Pipfile
(path: pathlib.Path = NOTHING, projectfile: requirementslib.models.project.ProjectFile = NOTHING, pipfile: requirementslib.models.pipfile.PipfileLoader = NOTHING, pyproject: tomlkit.toml_document.TOMLDocument = NOTHING, build_system: dict = NOTHING, requirements: list = NOTHING, dev_requirements: list = NOTHING)[source]¶ Bases:
object
-
allow_prereleases
¶
-
build_backend
¶
-
build_requires
¶
-
build_system
¶
-
dev_packages
¶
-
dev_requirements
¶
-
extended_keys
¶
-
classmethod
load
(path, create=False)[source]¶ Given a path, load or create the necessary pipfile.
Parameters: - path (Text) – Path to the project root or pipfile
- create (bool) – Whether to create the pipfile if not found, defaults to False
Raises: - OSError – Thrown if the project root directory doesn’t exist
- FileNotFoundError – Thrown if the pipfile doesn’t exist and
create=False
Returns: A pipfile instance pointing at the supplied project
Return type: requirementslib.models.pipfile.Pipfile
-
classmethod
load_projectfile
(path, create=False)[source]¶ Given a path, load or create the necessary pipfile.
Parameters: - path (Text) – Path to the project root or pipfile
- create (bool) – Whether to create the pipfile if not found, defaults to False
Raises: - OSError – Thrown if the project root directory doesn’t exist
- FileNotFoundError – Thrown if the pipfile doesn’t exist and
create=False
Returns: A project file instance for the supplied project
Return type:
-
packages
¶
-
path
¶
-
pipfile
¶
-
projectfile
¶
-
classmethod
read_projectfile
(path)[source]¶ Read the specified project file and provide an interface for writing/updating.
Parameters: path (Text) – Path to the target file. Returns: A project file with the model and location for interaction Return type: ProjectFile
-
requirements
¶
-
requires_python
¶
-
root
¶
-
-
class
requirementslib.
Requirement
(name=NOTHING, vcs=None, req=None, markers=None, specifiers=NOTHING, index=None, editable=None, hashes=NOTHING, extras=NOTHING, abstract_dep=None, line_instance=None, ireq=None)[source]¶ Bases:
object
-
as_line
(sources=None, include_hashes=True, include_extras=True, include_markers=True, as_list=False)[source]¶ Format this requirement as a line in requirements.txt.
If sources is provided, it should be a sequence of mappings containing all possible sources to be used for this requirement. If sources is omitted or falsy, no index information will be included in the requirement line.
-
build_backend
¶
-
commit_hash
¶
-
constraint_line
¶
-
extras_as_pip
¶
-
find_all_matches
(sources=None, finder=None)[source]¶ Find all matching candidates for the current requirement.
Consults a finder to find all matching candidates.
Parameters: - sources – Pipfile-formatted sources, defaults to None
- sources – list[dict], optional
- finder (PackageFinder) – A PackageFinder instance from pip’s repository implementation
Returns: A list of Installation Candidates
Return type: list[
InstallationCandidate
]
-
get_abstract_dependencies
(sources=None)[source]¶ Retrieve the abstract dependencies of this requirement.
Returns the abstract dependencies of the current requirement in order to resolve.
Parameters: - sources – A list of sources (pipfile format), defaults to None
- sources – list, optional
Returns: A list of abstract (unpinned) dependencies
Return type: list[
AbstractDependency
]
-
get_dependencies
(sources=None)[source]¶ Retrieve the dependencies of the current requirement.
Retrieves dependencies of the current requirement. This only works on pinned requirements.
Parameters: - sources – Pipfile-formatted sources, defaults to None
- sources – list[dict], optional
Returns: A set of requirement strings of the dependencies of this requirement.
Return type:
-
hashes_as_pip
¶
-
ireq
¶
-
is_direct_url
¶
-
is_file_or_url
¶
-
is_named
¶
-
is_vcs
¶
-
is_wheel
¶
-
line_instance
¶
-
markers_as_pip
¶
-
name
¶
-
normalized_name
¶
-
pipfile_entry
¶
-
requirement
¶
-
specifiers
¶
-
uses_pep517
¶
-
Submodules¶
requirementslib.models package¶
Submodules¶
requirementslib.models.cache module¶
-
class
requirementslib.models.cache.
DependencyCache
(cache_dir=None)[source]¶ Bases:
object
Creates a new persistent dependency cache for the current Python version. The cache file is written to the appropriate user cache dir for the current platform, i.e.
~/.cache/pip-tools/depcache-pyX.Y.json, where X.Y indicates the Python version.
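The per-Python-version filename described above can be sketched as follows; this is a simplified illustration of the naming scheme, not the cache class itself:

```python
# Sketch of the depcache filename scheme described above (assumed behavior,
# derived from the documented pattern depcache-pyX.Y.json).
import sys

def depcache_filename(version_info=sys.version_info):
    # Only the major and minor components participate in the name.
    return "depcache-py{}.{}.json".format(version_info[0], version_info[1])

print(depcache_filename((3, 7, 4, "final", 0)))  # depcache-py3.7.json
```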
-
as_cache_key
(ireq)[source]¶ Given a requirement, return its cache key. This behavior is a little weird in order to allow backwards compatibility with cache files. For a requirement without extras, this will return, for example:
(“ipython”, “2.1.0”)
For a requirement with extras, the extras will be comma-separated and appended to the version, inside brackets, like so:
(“ipython”, “2.1.0[nbconvert,notebook]”)
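The cache-key shape described above can be sketched with a small helper; this is an illustration of the documented format, not the library's actual as_cache_key implementation:

```python
# Illustrative sketch of the documented cache-key format: a (name, version)
# tuple, with sorted extras folded into the version string inside brackets.

def cache_key(name, version, extras=()):
    if extras:
        version = "{}[{}]".format(version, ",".join(sorted(extras)))
    return (name, version)

print(cache_key("ipython", "2.1.0"))
# ('ipython', '2.1.0')
print(cache_key("ipython", "2.1.0", ["notebook", "nbconvert"]))
# ('ipython', '2.1.0[nbconvert,notebook]')
```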
-
cache
¶ The dictionary that is the actual in-memory cache. This property lazily loads the cache from disk.
-
reverse_dependencies
(ireqs)[source]¶ Returns a lookup table of reverse dependencies for all the given ireqs.
Since this is all static, it only works if the dependency cache contains the complete data, otherwise you end up with a partial view. This is typically no problem if you use this function after the entire dependency tree is resolved.
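The lookup-table construction amounts to inverting a dependency mapping. A stdlib sketch of the idea (simplified: the real method operates on InstallRequirements and the on-disk cache, not plain dicts):

```python
# Build a reverse-dependency lookup: for each dependency, record which
# packages depend on it. A partial input mapping yields a partial view,
# as noted above.

def reverse_dependencies(dep_map):
    reverse = {}
    for pkg, deps in dep_map.items():
        for dep in deps:
            reverse.setdefault(dep, set()).add(pkg)
    return reverse

deps = {"pipenv": ["requests", "virtualenv"], "requests": ["urllib3"]}
print(reverse_dependencies(deps))
# {'requests': {'pipenv'}, 'virtualenv': {'pipenv'}, 'urllib3': {'requests'}}
```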
-
-
class
requirementslib.models.cache.
HashCache
(*args, **kwargs)[source]¶ Bases:
pip._internal.network.cache.SafeFileCache
Caches hashes of PyPI artifacts so we do not need to re-download them.
Hashes are only cached when the URL appears to contain a hash in it, and the cache key includes the hash value returned from the server. This ought to avoid issues where the location on the server changes.
requirementslib.models.dependencies module¶
-
class
requirementslib.models.dependencies.
AbstractDependency
(name, specifiers, markers, candidates, requirement, parent, finder, dep_dict=NOTHING)[source]¶ Bases:
object
-
compatible_abstract_dep
(other)[source]¶ Merge this abstract dependency with another one.
Return the result of the merge as a new abstract dependency.
Parameters: other ( AbstractDependency
) – An abstract dependency to merge withReturns: A new, combined abstract dependency Return type: AbstractDependency
-
compatible_versions
(other)[source]¶ Find compatible version numbers between this abstract dependency and another one.
Parameters: other ( AbstractDependency
) – An abstract dependency to compare with.Returns: A set of compatible version strings Return type: set(str)
-
classmethod
from_requirement
(requirement, parent=None)[source]¶ Creates a new AbstractDependency from a Requirement object. This class method is used to find all candidates matching a given set of specifiers and a given requirement.
Parameters: requirement (Requirement) – A requirement for resolution
-
-
requirementslib.models.dependencies.
find_all_matches
(finder, ireq, pre=False)[source]¶ Find all matching dependencies using the supplied finder and the given ireq.
Parameters: - finder (
PackageFinder
) – A package finder for discovering matching candidates. - ireq (
InstallRequirement
) – An install requirement.
Returns: A list of matching candidates.
Return type: list[
InstallationCandidate
]- finder (
-
requirementslib.models.dependencies.
get_abstract_dependencies
(reqs, sources=None, parent=None)[source]¶ Get all abstract dependencies for a given list of requirements.
Given a set of requirements, convert each requirement to an Abstract Dependency.
Parameters: - reqs (list[
Requirement
]) – A list of Requirements - sources – Pipfile-formatted sources, defaults to None
- sources – list[dict], optional
- parent – The parent of this list of dependencies, defaults to None
- parent –
Requirement
, optional
Returns: A list of Abstract Dependencies
Return type: list[
AbstractDependency
]- reqs (list[
-
requirementslib.models.dependencies.
get_dependencies
(ireq, sources=None, parent=None)[source]¶ Get all dependencies for a given install requirement.
Parameters: Returns: A set of dependency lines for generating new InstallRequirements.
Return type:
-
requirementslib.models.dependencies.
get_dependencies_from_cache
(ireq)[source]¶ Retrieves dependencies for the given install requirement from the dependency cache.
Parameters: ireq ( InstallRequirement
) – A single InstallRequirementReturns: A set of dependency lines for generating new InstallRequirements. Return type: set(str) or None
-
requirementslib.models.dependencies.
get_dependencies_from_index
(dep, sources=None, pip_options=None, wheel_cache=None)[source]¶ Retrieves dependencies for the given install requirement from the pip resolver.
Parameters: Returns: A set of dependency lines for generating new InstallRequirements.
Return type:
-
requirementslib.models.dependencies.
get_dependencies_from_json
(ireq)[source]¶ Retrieves dependencies for the given install requirement from the json api.
Parameters: ireq ( InstallRequirement
) – A single InstallRequirementReturns: A set of dependency lines for generating new InstallRequirements. Return type: set(str) or None
-
requirementslib.models.dependencies.
get_dependencies_from_wheel_cache
(ireq)[source]¶ Retrieves dependencies for the given install requirement from the wheel cache.
Parameters: ireq ( InstallRequirement
) – A single InstallRequirementReturns: A set of dependency lines for generating new InstallRequirements. Return type: set(str) or None
-
requirementslib.models.dependencies.
get_finder
(sources=None, pip_command=None, pip_options=None)[source]¶ Get a package finder for looking up candidates to install
Parameters: - sources – A list of pipfile-formatted sources, defaults to None
- sources – list[dict], optional
- pip_command (
Command
) – A pip command instance, defaults to None - pip_options (
cmdoptions
) – Parsed pip options, defaults to None
Returns: A package finder
Return type: PackageFinder
-
requirementslib.models.dependencies.
get_pip_options
(args=[], sources=None, pip_command=None)[source]¶ Build a pip command from a list of sources
Parameters: - args – positional arguments passed through to the pip parser
- sources – A list of pipfile-formatted sources, defaults to None
- sources – list[dict], optional
- pip_command (
Command
) – A pre-built pip command instance
Returns: An instance of pip_options using the supplied arguments plus sane defaults
Return type: cmdoptions
-
requirementslib.models.dependencies.
start_resolver
(finder=None, session=None, wheel_cache=None)[source]¶ Context manager to produce a resolver.
Parameters: - finder (PackageFinder) – A package finder to use for searching the index
- session (Session) – A session instance
- wheel_cache (WheelCache) – A pip WheelCache instance
Returns: A 3-tuple of (finder, preparer, resolver)
Return type: (RequirementPreparer, Resolver)
requirementslib.models.lockfile module¶
-
class
requirementslib.models.lockfile.
Lockfile
(path: pathlib.Path = NOTHING, requirements: list = NOTHING, dev_requirements: list = NOTHING, projectfile: requirementslib.models.project.ProjectFile = NOTHING, lockfile: plette.lockfiles.Lockfile = NOTHING, newlines: str = '\n')[source]¶ Bases:
object
-
as_requirements
(include_hashes=False, dev=False)[source]¶ Returns a list of requirements in pip-style format
-
default
¶
-
dev_requirements
¶
-
dev_requirements_list
¶
-
develop
¶
-
extended_keys
¶
-
classmethod
from_data
(path, data, meta_from_project=True)[source]¶ Create a new lockfile instance from a dictionary.
Parameters:
-
get_requirements
(dev=True, only=False)[source]¶ Produces a generator which generates requirements from the desired section.
Parameters: dev (bool) – Indicates whether to use dev requirements, defaults to True Returns: Requirements from the relevant pipfile Return type: Requirement
-
classmethod
load
(path, create=True)[source]¶ Create a new lockfile instance.
Parameters: - project_path (str or
pathlib.Path
) – Path to project root or lockfile - lockfile_name (str) – Name of the lockfile in the project root directory
- pipfile_path (
pathlib.Path
) – Path to the project pipfile
Returns: A new lockfile representing the supplied project paths
Return type: - project_path (str or
-
classmethod
load_projectfile
(path, create=True, data=None)[source]¶ Given a path, load or create the necessary lockfile.
Parameters: Raises: - OSError – Thrown if the project root directory doesn’t exist
- FileNotFoundError – Thrown if the lockfile doesn’t exist and
create=False
Returns: A project file instance for the supplied project
Return type:
-
lockfile
¶
-
newlines
¶
-
path
¶
-
projectfile
¶
-
classmethod
read_projectfile
(path)[source]¶ Read the specified project file and provide an interface for writing/updating.
Parameters: path (str) – Path to the target file. Returns: A project file with the model and location for interaction Return type: ProjectFile
-
requirements
¶
-
requirements_list
¶
-
section_keys
¶
-
requirementslib.models.markers module¶
-
class
requirementslib.models.markers.
PipenvMarkers
(os_name=None, sys_platform=None, platform_machine=None, platform_python_implementation=None, platform_release=None, platform_system=None, platform_version=None, python_version=None, python_full_version=None, implementation_name=None, implementation_version=None)[source]¶ Bases:
object
System-level requirements - see PEP508 for more detail
-
line_part
¶
-
pipfile_part
¶
-
-
requirementslib.models.markers.
contains_extra
[source]¶ Check whether a marker contains an “extra == …” operand.
-
requirementslib.models.markers.
contains_pyversion
[source]¶ Check whether a marker contains a python_version operand.
-
requirementslib.models.markers.
get_contained_extras
[source]¶ Collect “extra == …” operands from a marker.
Returns a list of str. Each str is a specified extra in this marker.
-
requirementslib.models.markers.
get_contained_pyversions
[source]¶ Collect all python_version operands from a marker.
-
requirementslib.models.markers.
get_without_extra
(marker)[source]¶ Build a new marker without the extra == … part.
The implementation reaches deep into packaging’s internals, but I don’t have a better way right now (short of implementing the whole thing myself).
This could return None if the extra == … part is the only one in the input marker.
-
requirementslib.models.markers.
get_without_pyversion
(marker)[source]¶ Build a new marker without the python_version part.
This could return None if the python_version section is the only section in the marker.
-
requirementslib.models.markers.
normalize_specifier_set
(specs)[source]¶ Given a specifier set, a string, or an iterable, normalize the specifiers
Note
This function exists largely to deal with pyzmq, which handles the requires_python specifier incorrectly, using 3.7* rather than the correct form 3.7.*. This workaround can likely go away if we ever introduce enforcement for metadata standards on PyPI.
Parameters: specs (Union[str, SpecifierSet]) – Supplied specifiers to normalize
Returns: A new set of specifiers or a SpecifierSet
Return type: Union[Set[Specifier], SpecifierSet]
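The kind of normalization described in the note can be sketched with a small regex rewrite; this is an illustration of the idea, not requirementslib's exact implementation:

```python
# Sketch: rewrite a malformed trailing wildcard like ">=3.7*" into the valid
# ">=3.7.*" form, the pyzmq-style breakage described above.
import re

def normalize_specifier(spec):
    # Insert the missing dot before a '*' that directly follows a digit.
    return re.sub(r"(\d)\*", r"\1.*", spec)

print(normalize_specifier("==3.7*"))   # ==3.7.*
print(normalize_specifier("==3.7.*"))  # ==3.7.* (already valid, unchanged)
```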
requirementslib.models.metadata module¶
-
class
requirementslib.models.metadata.
Dependency
(name: str, requirement: packaging.requirements.Requirement, specifier, extras=NOTHING, from_extras=None, python_version='', parent=None, markers=None, specset_str: str = '', python_version_str: str = '', marker_str: str = '')[source]¶ Bases:
object
-
extras
= None¶ Any extras this dependency declares
-
from_extras
= None¶ The name of the extra meta-dependency this one came from (e.g. ‘security’)
-
markers
= None¶ The markers for this dependency
-
name
= None¶ The name of the dependency
-
parent
= None¶ The parent of this dependency (i.e. where it came from)
-
python_version
= None¶ The declared specifier set of allowable python versions for this dependency
-
requirement
= None¶ A requirement instance
-
specifier
= None¶ The specifier defined in the dependency definition
-
-
class
requirementslib.models.metadata.
Digest
(algorithm: str, value: str)[source]¶ Bases:
object
-
algorithm
= None¶ The algorithm declared for the digest, e.g. ‘sha256’
-
value
= None¶ The digest value
-
-
class
requirementslib.models.metadata.
ExtrasCollection
(name: str, parent: Dependency, dependencies=NOTHING)[source]¶ Bases:
object
-
dependencies
= None¶ The members of the collection
-
name
= None¶ The name of the extras collection (e.g. ‘security’)
-
parent
= None¶ The dependency the collection belongs to
-
-
class
requirementslib.models.metadata.
Package
(info, last_serial: int, releases, urls=NOTHING)[source]¶ Bases:
object
-
dependencies
¶
-
latest_sdist
¶
-
latest_wheels
¶
-
name
¶
-
requirement
¶
-
version
¶
-
-
class
requirementslib.models.metadata.
PackageEncoder
(*, skipkeys=False, ensure_ascii=True, check_circular=True, allow_nan=True, sort_keys=False, indent=None, separators=None, default=None)[source]¶ Bases:
json.encoder.JSONEncoder
-
default
(obj)[source]¶ Implement this method in a subclass such that it returns a serializable object for
o
, or calls the base implementation (to raise aTypeError
).For example, to support arbitrary iterators, you could implement default like this:
def default(self, o): try: iterable = iter(o) except TypeError: pass else: return list(iterable) # Let the base class default method raise the TypeError return JSONEncoder.default(self, o)
-
-
class
requirementslib.models.metadata.
PackageInfo
(name: str, version: str, package_url: str, summary: str = None, author: str = None, keywords=NOTHING, description: str = '', download_url: str = '', home_page: str = '', license: str = '', maintainer: str = '', maintainer_email: str = '', downloads=NOTHING, docs_url=None, platform: str = '', project_url: str = '', project_urls=NOTHING, requires_python=None, requires_dist=NOTHING, release_url=None, description_content_type: str = 'text/md', bugtrack_url=None, classifiers=NOTHING, author_email=None, markers=None, dependencies=None)[source]¶ Bases:
object
-
create_dependencies
(force=False)[source]¶ Create values for self.dependencies.
Parameters: force (bool) – Sets self.dependencies to an empty tuple if it would be None, defaults to False. Returns: An updated instance of the current object with self.dependencies updated accordingly. Return type: PackageInfo
-
-
class
requirementslib.models.metadata.
ParsedTag
(marker_string=None, python_version=None, platform_system=None, abi=None)[source]¶ Bases:
object
-
abi
= None¶ the ABI represented by the tag
-
marker_string
= None¶ The marker string corresponding to the tag
-
platform_system
= None¶ The platform represented by the tag
-
python_version
= None¶ The python version represented by the tag
-
-
class
requirementslib.models.metadata.
Release
(version: str, urls, name=None)[source]¶ Bases:
collections.abc.Sequence
-
latest
¶
-
latest_timestamp
¶
-
name
= None¶ the name of the package
-
parsed_version
¶
-
sdists
¶
-
urls
= None¶ The URL collection for the release
-
version
= None¶ The version of the release
-
wheels
¶
-
yanked
¶
-
-
class
requirementslib.models.metadata.
ReleaseCollection
(releases=NOTHING)[source]¶ Bases:
object
-
latest
¶
-
non_yanked_releases
¶
-
-
class
requirementslib.models.metadata.
ReleaseUrl
(md5_digest: requirementslib.models.metadata.Digest, packagetype: str, upload_time, upload_time_iso_8601, size: int, url: str, digests, name: str = None, comment_text: str = '', yanked: bool = False, downloads: int = -1, filename: str = '', has_sig: bool = False, python_version: str = 'source', requires_python: str = None, tags=NOTHING)[source]¶ Bases:
object
-
comment_text
= None¶ The available comments of the given upload
-
digests
= None¶ The digests of the package
-
downloads
= None¶ The number of downloads (deprecated)
-
filename
= None¶ The filename of the current upload
-
has_sig
= None¶ Whether the upload has a signature
-
is_sdist
¶
-
is_wheel
¶
-
markers
¶
-
md5_digest
= None¶ The MD5 digest of the given release
-
name
= None¶ The name of the package
-
packagetype
= None¶ The package type of the url
-
pep508_url
¶
-
python_version
= None¶ The python_version attribute of the upload (e.g. ‘source’, ‘py27’, etc)
-
requires_python
= None¶ The ‘requires_python’ restriction on the package
-
sha256
¶
-
size
= None¶ The size in bytes of the package
-
tags
= None¶ A list of valid parsed tags from the upload
-
upload_time
= None¶ The upload timestamp from the package
-
upload_time_iso_8601
= None¶ The ISO8601 formatted upload timestamp of the package
-
url
= None¶ The URL of the package
-
yanked
= None¶ Whether the url has been yanked from the server
-
-
class
requirementslib.models.metadata.
ReleaseUrlCollection
(urls, name=None)[source]¶ Bases:
collections.abc.Sequence
-
find_package_type
(type_)[source]¶ Given a package type (e.g. sdist, bdist_wheel), find the matching release.
Parameters: type (str) – A package type from PACKAGE_TYPES
Returns: The package from this collection matching that type, if available Return type: Optional[ReleaseUrl]
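The selection logic can be sketched in a few lines; this simplified version operates on plain dicts, whereas the real method works with ReleaseUrl instances:

```python
# Sketch: return the first release URL whose packagetype matches the
# requested type (e.g. 'sdist' or 'bdist_wheel'), or None if absent.

def find_package_type(urls, type_):
    for url in urls:
        if url.get("packagetype") == type_:
            return url
    return None

# The filenames below are hypothetical examples.
urls = [
    {"packagetype": "bdist_wheel", "filename": "vistir-0.5.2-py2.py3-none-any.whl"},
    {"packagetype": "sdist", "filename": "vistir-0.5.2.tar.gz"},
]
print(find_package_type(urls, "sdist")["filename"])  # vistir-0.5.2.tar.gz
```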
-
latest
¶
-
latest_timestamp
¶
-
name
= None¶ the name of the package
-
sdists
¶
-
urls
= None¶ A list of release URLs
-
wheels
¶
-
-
requirementslib.models.metadata.
instance_check_converter
(expected_type=None, converter=None)[source]¶
requirementslib.models.pipfile module¶
-
class
requirementslib.models.pipfile.
Pipfile
(path: pathlib.Path = NOTHING, projectfile: requirementslib.models.project.ProjectFile = NOTHING, pipfile: requirementslib.models.pipfile.PipfileLoader = NOTHING, pyproject: tomlkit.toml_document.TOMLDocument = NOTHING, build_system: dict = NOTHING, requirements: list = NOTHING, dev_requirements: list = NOTHING)[source]¶ Bases:
object
-
allow_prereleases
¶
-
build_backend
¶
-
build_requires
¶
-
build_system
¶
-
dev_packages
¶
-
dev_requirements
¶
-
extended_keys
¶
-
classmethod
load
(path, create=False)[source]¶ Given a path, load or create the necessary pipfile.
Parameters: - path (Text) – Path to the project root or pipfile
- create (bool) – Whether to create the pipfile if not found, defaults to False
Raises: - OSError – Thrown if the project root directory doesn’t exist
- FileNotFoundError – Thrown if the pipfile doesn’t exist and
create=False
Returns: A pipfile instance pointing at the supplied project
Return type: Pipfile
-
classmethod
load_projectfile
(path, create=False)[source]¶ Given a path, load or create the necessary pipfile.
Parameters: - path (Text) – Path to the project root or pipfile
- create (bool) – Whether to create the pipfile if not found, defaults to False
Raises: - OSError – Thrown if the project root directory doesn’t exist
- FileNotFoundError – Thrown if the pipfile doesn’t exist and
create=False
Returns: A project file instance for the supplied project
Return type: ProjectFile
-
packages
¶
-
path
¶
-
pipfile
¶
-
projectfile
¶
-
classmethod
read_projectfile
(path)[source]¶ Read the specified project file and provide an interface for writing/updating.
Parameters: path (Text) – Path to the target file. Returns: A project file with the model and location for interaction Return type: ProjectFile
-
requirements
¶
-
requires_python
¶
-
root
¶
-
-
class
requirementslib.models.pipfile.
PipfileLoader
(data)[source]¶ Bases:
plette.pipfiles.Pipfile
-
classmethod
ensure_package_sections
(data)[source]¶ Ensure that all pipfile package sections are present in the given toml document
Parameters: data (TOMLDocument) – The toml document to ensure package sections are present on
Returns: The updated toml document, ensuring packages and dev-packages sections are present
Return type: TOMLDocument
requirementslib.models.project module¶
-
class
requirementslib.models.project.
FileDifference
(default, develop)¶ Bases:
tuple
-
default
¶ Alias for field number 0
-
develop
¶ Alias for field number 1
-
-
class
requirementslib.models.project.
Project
(root)[source]¶ Bases:
object
-
difference_lockfile
(lockfile)[source]¶ Generate a difference between the current and given lockfiles.
Returns a 2-tuple containing differences in the default and develop sections.
Each element is a 2-tuple of dicts. The first contains entries only present in the current lockfile; the second contains entries only present in the given one.
If a key exists in both this and that, but the values differ, the key is present in both dicts, pointing to values from each file.
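The difference semantics described above can be sketched for a single section as follows. This is an illustrative helper operating on plain name-to-version dicts, not the actual Project.difference_lockfile implementation:

```python
# Hypothetical sketch: compare two mappings of name -> pinned version and
# return (in_this, in_that), where keys with differing values appear in
# both result dicts (pointing at each file's value), and keys unique to
# one file appear only in that file's dict.
def diff_section(this, that):
    in_this = {k: v for k, v in this.items() if that.get(k) != v}
    in_that = {k: v for k, v in that.items() if this.get(k) != v}
    return in_this, in_that
```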
-
lockfile
¶
-
lockfile_location
¶
-
pipfile
¶
-
pipfile_location
¶
-
-
class
requirementslib.models.project.
ProjectFile
(location, line_ending, model)[source]¶ Bases:
object
A file in the Pipfile project.
requirementslib.models.requirements module¶
-
class
requirementslib.models.requirements.
FileRequirement
(setup_path=None, path=None, editable=False, extras=NOTHING, uri_scheme=None, uri=NOTHING, link=NOTHING, pyproject_requires=NOTHING, pyproject_backend=None, pyproject_path=None, subdirectory=None, setup_info=None, has_hashed_name=False, parsed_line=None, name=NOTHING, req=NOTHING)[source]¶ Bases:
object
File requirements for installable files such as tar.gz archives and wheels, or for directories containing a setup.py.
-
dependencies
¶
-
editable
¶ Whether the package is editable
-
extras
¶ Extras if applicable
-
formatted_path
¶
-
classmethod
get_link_from_line
(line)[source]¶ Parse link information from given requirement line.
Return a 6-tuple:
- vcs_type indicates the VCS to use (e.g. “git”), or None.
- prefer is either “file”, “path” or “uri”, indicating how the information should be used in later stages.
- relpath is the relative path to use when recording the dependency, instead of the absolute path/URI used to perform installation. This can be None (to prefer the absolute path or URI).
- path is the absolute file path to the package. This will always use forward slashes. Can be None if the line is a remote URI.
- uri is the absolute URI to the package. Can be None if the line is not a URI.
- link is an instance of pip._internal.index.Link, representing a URI parse result based on the value of uri.
This function is provided to deal with edge cases concerning URIs without a valid netloc. Those URIs are problematic to a straight urlsplit call because they cannot be reliably reconstructed with urlunsplit due to a bug in the standard library:
>>> from urllib.parse import urlsplit, urlunsplit
>>> urlunsplit(urlsplit('git+file:///this/breaks'))
'git+file:/this/breaks'
>>> urlunsplit(urlsplit('file:///this/works'))
'file:///this/works'
See https://bugs.python.org/issue23505#msg277350.
-
is_direct_url
¶
-
is_local
¶
-
is_remote_artifact
¶
-
line_part
¶
-
link
¶ Link object representing the package to clone
-
name
¶ Package name
-
parsed_line
¶
-
path
¶ path to hit - without any of the VCS prefixes (like git+ / http+ / etc)
-
pipfile_part
¶
-
pyproject_backend
¶ PyProject Build System
-
pyproject_path
¶ PyProject Path
-
pyproject_requires
¶ PyProject Requirements
-
req
¶ A
Requirement
instance
-
setup_info
¶
-
setup_path
¶ Path to the relevant setup.py location
-
setup_py_dir
¶
-
subdirectory
¶
-
uri
¶ URI of the package
-
-
class
requirementslib.models.requirements.
Line
(line, extras=None)[source]¶ Bases:
object
-
base_path
¶
-
ireq
¶
-
is_artifact
¶
-
is_direct_url
¶
-
is_file
¶
-
is_file_url
¶
-
is_installable
¶
-
is_named
¶
-
is_path
¶
-
is_remote_url
¶
-
is_url
¶
-
is_vcs
¶
-
is_wheel
¶
-
line_for_ireq
¶
-
line_is_installable
¶ This is a safeguard against decoy requirements when a user installs a package whose name coincides with the name of a folder in the cwd, e.g. install alembic when there is a folder called alembic in the working directory.
In this case we first need to check that the given requirement is a valid URL, VCS requirement, or installable filesystem path before deciding to treat it as a file requirement over a named requirement.
-
line_with_prefix
¶
-
link
¶
-
metadata
¶
-
name
¶
-
name_and_specifier
¶
-
parse_extras
()[source]¶ Parse extras from self.line and set them on the current object.
Returns: self Return type:
Line
-
parse_hashes
()[source]¶ Parse hashes from self.line and set them on the current object.
Returns: Self Return type: Line
-
parsed_setup_cfg
¶
-
parsed_setup_py
¶
-
parsed_url
¶
-
pyproject_backend
¶
-
pyproject_requires
¶
-
pyproject_toml
¶
-
ref
¶
-
requirement
¶
-
requirement_info
¶ Generates a 3-tuple of the requisite name, extras and url to generate a
Requirement
out of. Returns: A tuple of an optional name, a tuple of extras, and an optional URL. Return type: Tuple[Optional[S], Tuple[Optional[S], ...], Optional[S]]
-
setup_cfg
¶
-
setup_info
¶
-
setup_py
¶
-
specifier
¶
-
specifiers
¶
-
subdirectory
¶
-
url
¶
-
vcsrepo
¶
-
wheel_kwargs
¶
-
-
class
requirementslib.models.requirements.
LinkInfo
(vcs_type, prefer, relpath, path, uri, link)¶ Bases:
tuple
-
link
¶ Alias for field number 5
-
path
¶ Alias for field number 3
-
prefer
¶ Alias for field number 1
-
relpath
¶ Alias for field number 2
-
uri
¶ Alias for field number 4
-
vcs_type
¶ Alias for field number 0
-
-
class
requirementslib.models.requirements.
NamedRequirement
(name, version, req=NOTHING, extras=NOTHING, editable=False, parsed_line=None)[source]¶ Bases:
object
-
editable
¶
-
extras
¶
-
line_part
¶
-
name
¶
-
parsed_line
¶
-
pipfile_part
¶
-
req
¶
-
version
¶
-
-
class
requirementslib.models.requirements.
Requirement
(name=NOTHING, vcs=None, req=None, markers=None, specifiers=NOTHING, index=None, editable=None, hashes=NOTHING, extras=NOTHING, abstract_dep=None, line_instance=None, ireq=None)[source]¶ Bases:
object
-
as_line
(sources=None, include_hashes=True, include_extras=True, include_markers=True, as_list=False)[source]¶ Format this requirement as a line in requirements.txt.
If
sources
is provided, it should be a sequence of mappings containing all possible sources to be used for this requirement. If
sources
is omitted or falsy, no index information will be included in the requirement line.
-
build_backend
¶
-
commit_hash
¶
-
constraint_line
¶
-
extras_as_pip
¶
-
find_all_matches
(sources=None, finder=None)[source]¶ Find all matching candidates for the current requirement.
Consults a finder to find all matching candidates.
Parameters: - sources – Pipfile-formatted sources, defaults to None
- sources – list[dict], optional
- finder (PackageFinder) – A PackageFinder instance from pip’s repository implementation
Returns: A list of Installation Candidates
Return type: list[
InstallationCandidate
]
-
get_abstract_dependencies
(sources=None)[source]¶ Retrieve the abstract dependencies of this requirement.
Returns the abstract dependencies of the current requirement in order to resolve.
Parameters: - sources – A list of sources (pipfile format), defaults to None
- sources – list, optional
Returns: A list of abstract (unpinned) dependencies
Return type: list[
AbstractDependency
]
-
get_dependencies
(sources=None)[source]¶ Retrieve the dependencies of the current requirement.
Retrieves dependencies of the current requirement. This only works on pinned requirements.
Parameters: - sources – Pipfile-formatted sources, defaults to None
- sources – list[dict], optional
Returns: A set of requirement strings of the dependencies of this requirement.
Return type:
-
hashes_as_pip
¶
-
ireq
¶
-
is_direct_url
¶
-
is_file_or_url
¶
-
is_named
¶
-
is_vcs
¶
-
is_wheel
¶
-
line_instance
¶
-
markers_as_pip
¶
-
name
¶
-
normalized_name
¶
-
pipfile_entry
¶
-
requirement
¶
-
specifiers
¶
-
uses_pep517
¶
-
-
class
requirementslib.models.requirements.
VCSRequirement
(setup_path=None, extras=NOTHING, uri_scheme=None, pyproject_requires=NOTHING, pyproject_backend=None, pyproject_path=None, subdirectory=None, setup_info=None, has_hashed_name=False, parsed_line=None, editable=None, uri=None, path=None, vcs=None, ref=None, repo=None, base_line=None, name=NOTHING, link=NOTHING, req=NOTHING)[source]¶ Bases:
requirementslib.models.requirements.FileRequirement
-
editable
¶ Whether the repository is editable
-
line_part
¶ requirements.txt compatible line part sans-extras.
-
link
¶
-
name
¶
-
path
¶ path to the repository, if it’s local
-
pipfile_part
¶
-
ref
¶ vcs reference name (branch / commit / tag)
-
repo
¶
-
req
¶
-
setup_info
¶
-
uri
¶ URI for the repository
-
url
¶
-
vcs
¶ vcs type, i.e. git/hg/svn
-
vcs_uri
¶
-
requirementslib.models.resolvers module¶
-
class
requirementslib.models.resolvers.
DependencyResolver
(pinned_deps=NOTHING, dep_dict=NOTHING, candidate_dict=NOTHING, pin_history=NOTHING, allow_prereleases=False, hashes=NOTHING, hash_cache=NOTHING, finder=None, include_incompatible_hashes=True, available_candidates_cache=NOTHING)[source]¶ Bases:
object
-
add_abstract_dep
(dep)[source]¶ Add an abstract dependency by either creating a new entry or merging with an old one.
Parameters: dep ( AbstractDependency
) – An abstract dependency to addRaises: ResolutionError – Raised when the given dependency is not compatible with an existing abstract dependency.
-
allow_all_wheels
()[source]¶ Monkey patches pip.Wheel to allow wheels from all platforms and Python versions.
This also saves the candidate cache and sets a new one, or else the results from the previous non-patched calls will interfere.
-
allow_prereleases
= None¶ Whether to allow prerelease dependencies
-
candidate_dict
= None¶ A dictionary of sets of version numbers that are currently valid for each candidate
-
dep_dict
= None¶ A dictionary of abstract dependencies by name
-
dependencies
¶
-
finder
= None¶ A finder for searching the index
-
hash_cache
= None¶ A hash cache
-
hashes
= None¶ Stores hashes for each dependency
-
include_incompatible_hashes
= None¶ Whether to include hashes even from incompatible wheels
-
pin_deps
()[source]¶ Pins the current abstract dependencies and adds them to the history dict.
Adds any new dependencies to the abstract dependencies already present by merging them together to form new, compatible abstract dependencies.
-
pin_history
= None¶ A historical record of pins
-
resolution
¶
-
resolve
(root_nodes, max_rounds=20)[source]¶ Resolves dependencies using a backtracking resolver and multiple endpoints.
Note: this resolver caches aggressively. Runs for max_rounds or until any two pinning rounds yield the same outcome.
Parameters: - root_nodes (list[
Requirement
]) – A list of the root requirements. - max_rounds – The max number of resolution rounds, defaults to 20
- max_rounds – int, optional
Raises: RuntimeError – Raised when max rounds is exceeded without a resolution.
- root_nodes (list[
-
requirementslib.models.setup_info module¶
-
class
requirementslib.models.setup_info.
Analyzer
[source]¶ Bases:
ast.NodeVisitor
-
class
requirementslib.models.setup_info.
BaseRequirement
(name='', requirement=None)[source]¶ Bases:
object
-
name
¶
-
requirement
¶
-
-
class
requirementslib.models.setup_info.
BuildEnv
(cleanup=True)[source]¶ Bases:
pep517.envbuild.BuildEnvironment
-
class
requirementslib.models.setup_info.
Extra
(name=None, requirements: frozenset = NOTHING)[source]¶ Bases:
object
-
name
¶
-
requirements
¶
-
-
class
requirementslib.models.setup_info.
HookCaller
(source_dir, build_backend, backend_path=None)[source]¶ Bases:
pep517.wrappers.Pep517HookCaller
-
class
requirementslib.models.setup_info.
SetupInfo
(name=None, base_dir=None, version=None, requirements: frozenset = NOTHING, build_requires=None, build_backend=NOTHING, setup_requires=None, python_requires=None, extras_requirements=None, setup_cfg: pathlib.Path = None, setup_py: pathlib.Path = None, pyproject: pathlib.Path = None, ireq=None, extra_kwargs: dict = NOTHING, metadata=None, stack=None, finalizer=None)[source]¶ Bases:
object
-
base_dir
¶
-
build_backend
¶
-
build_requires
¶
-
egg_base
¶
-
extra_kwargs
¶
-
extras
¶
-
get_egg_metadata
(metadata_dir=None, metadata_type=None)[source]¶ Given a metadata directory, return the corresponding metadata dictionary.
Returns: A metadata dictionary built from the metadata in the given location
Return type: Dict[Any, Any]
-
get_metadata_from_wheel
(wheel_path)[source]¶ Given a path to a wheel, return the metadata from that wheel.
Returns: A dictionary of metadata from the provided wheel Return type: Dict[Any, Any]
-
ireq
¶
-
metadata
¶
-
name
¶
-
pep517_config
¶
-
populate_metadata
(metadata)[source]¶ Populates the metadata dictionary from the supplied metadata.
Returns: The current instance. Return type: SetupInfo
-
pyproject
¶
-
python_requires
¶
-
reload
()[source]¶ Wipe existing distribution info metadata for rebuilding.
Erases metadata from self.egg_base and unsets self.requirements and self.extras.
-
requires
¶
-
run_pyproject
()[source]¶ Populates the pyproject.toml metadata if available.
Returns: The current instance Return type: SetupInfo
-
setup_cfg
¶
-
setup_py
¶
-
setup_requires
¶
-
stack
¶
-
version
¶
-
-
requirementslib.models.setup_info.
ast_unparse
(item, initial_mapping=False, analyzer=None, recurse=True)[source]¶
-
requirementslib.models.setup_info.
build_pep517
(source_dir, build_dir, config_settings=None, dist_type='wheel')[source]¶
-
requirementslib.models.setup_info.
iter_metadata
(path, pkg_name=None, metadata_type='egg-info')[source]¶
-
requirementslib.models.setup_info.
pep517_subprocess_runner
(cmd, cwd=None, extra_environ=None)[source]¶ The default method of calling the wrapper subprocess.
-
requirementslib.models.setup_info.
run_setup
(script_path, egg_base=None)[source]¶ Run a setup.py script with a target egg_base if provided.
Parameters: - script_path (S) – The path to the setup.py script to run
- egg_base (Optional[S]) – The metadata directory to build in
Raises: FileNotFoundError – If the provided script_path does not exist
Returns: The metadata dictionary
Return type: Dict[Any, Any]
requirementslib.models.utils module¶
-
requirementslib.models.utils.
as_tuple
(ireq)[source]¶ Pulls out the (name: str, version: str, extras: Tuple[str, ...]) tuple from the pinned InstallRequirement.
-
requirementslib.models.utils.
build_vcs_uri
(vcs, uri, name=None, ref=None, subdirectory=None, extras=None)[source]¶
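The parameters above map onto pip's VCS URL syntax. As an illustration, here is a hedged sketch that assembles such a URI from the same parts; the output format is an assumption based on pip's documented VCS URL grammar, not the library's exact implementation:

```python
# Hypothetical sketch of assembling a pip-style VCS URI:
#   <vcs>+<uri>@<ref>#egg=<name>[<extras>]&subdirectory=<subdirectory>
def build_vcs_uri_sketch(vcs, uri, name=None, ref=None, subdirectory=None, extras=None):
    line = "{0}+{1}".format(vcs, uri)
    if ref:
        line = "{0}@{1}".format(line, ref)
    fragments = []
    if name:
        egg = name + ("[{0}]".format(",".join(extras)) if extras else "")
        fragments.append("egg={0}".format(egg))
    if subdirectory:
        fragments.append("subdirectory={0}".format(subdirectory))
    if fragments:
        line = "{0}#{1}".format(line, "&".join(fragments))
    return line
```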
-
requirementslib.models.utils.
clean_requires_python
(candidates)[source]¶ Get a cleaned list of all the candidates with valid specifiers in the requires_python attributes.
-
requirementslib.models.utils.
convert_direct_url_to_url
(direct_url)[source]¶ Converts direct URLs to standard, link-style URLs
Given a direct url as defined by PEP 508, convert to a
Link
compatible URL by moving the name and extras into an egg_fragment.Parameters: direct_url (str) – A pep-508 compliant direct url. Returns: A reformatted URL for use with Link objects and InstallRequirement
objects.Return type: AnyStr
-
requirementslib.models.utils.
convert_url_to_direct_url
(url, name=None)[source]¶ Converts normal link-style URLs to direct urls.
Given a
Link
compatible URL, convert to a direct url as defined by PEP 508 by extracting the name and extras from the egg_fragment.Parameters: - url (AnyStr) – A
InstallRequirement
compliant URL. - name (Optional[AnyStr]) – A name to use in case the supplied URL doesn’t provide one.
Returns: A pep-508 compliant direct url.
Return type: AnyStr
Raises: - ValueError – Raised when the URL can’t be parsed or a name can’t be found.
- TypeError – When a non-string input is provided.
- url (AnyStr) – A
-
requirementslib.models.utils.
expand_env_variables
(line)[source]¶ Expand the env vars in a line following pip’s standard. https://pip.pypa.io/en/stable/reference/pip_install/#id10
Matches environment variable-style values in ‘${MY_VARIABLE_1}’ with the variable name consisting of only uppercase letters, digits or the ‘_’
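The matching rule above can be sketched with a small regex-based stand-in. This is an assumption about the behavior (pip documents that unset variables are left unchanged), not the library's exact code:

```python
import os
import re

# Hypothetical sketch of pip's ${VARIABLE} expansion: only uppercase
# letters, digits, and underscores form a variable name; variables that
# are not set in the environment are left in place unchanged.
def expand_env_variables_sketch(line):
    for env_var, var_name in re.findall(r"(\$\{([A-Z0-9_]+)\})", line):
        value = os.environ.get(var_name)
        if value is not None:
            line = line.replace(env_var, value)
    return line
```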
-
requirementslib.models.utils.
extras_to_string
(extras)[source]¶ Turn a list of extras into a string
Parameters: extras (List[str]]) – a list of extras to format Returns: A string of extras Return type: str
-
requirementslib.models.utils.
flat_map
(fn, collection)[source]¶ Map a function over a collection and flatten the result by one-level
-
requirementslib.models.utils.
format_requirement
(ireq)[source]¶ Formats an InstallRequirement instance as a string.
Generic formatter for pretty printing InstallRequirements to the terminal in a less verbose way than using its __str__ method.
Parameters: ireq (InstallRequirement) – A pip InstallRequirement instance. Returns: A formatted string for pretty printing Return type: str
-
requirementslib.models.utils.
format_specifier
(ireq)[source]¶ Generic formatter for pretty printing specifiers.
Pretty-prints specifiers from InstallRequirements for output to terminal.
Parameters: ireq (InstallRequirement) – A pip InstallRequirement instance. Returns: A string of specifiers in the given install requirement or <any> Return type: str
-
requirementslib.models.utils.
full_groupby
(iterable, key=None)[source]¶ Like groupby(), but sorts the input on the group key first.
-
requirementslib.models.utils.
get_name_variants
(pkg)[source]¶ Given a package name, get the variants of its name for both the canonicalized and “safe” forms.
Parameters: pkg (AnyStr) – The package to lookup Returns: A list of names. Return type: Set
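As an illustration of what such variants look like, here is a simplified sketch, assuming PEP 503-style canonicalization and a setuptools-style “safe” name; the exact set the library returns may differ:

```python
import re

# Hypothetical sketch: collect a few common name forms for a package.
def name_variants_sketch(pkg):
    canonical = re.sub(r"[-_.]+", "-", pkg).lower()   # PEP 503 canonical form
    safe = re.sub(r"[^A-Za-z0-9.]+", "-", pkg)        # setuptools-style "safe" name
    return {pkg, pkg.lower(), canonical, safe}
```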
-
requirementslib.models.utils.
get_pinned_version
(ireq)[source]¶ Get the pinned version of an InstallRequirement.
An InstallRequirement is considered pinned if:
- it is not editable
- it has exactly one specifier
- that specifier is “==”
- the version does not contain a wildcard
Examples:
django==1.8 # pinned
django>1.8 # NOT pinned
django~=1.8 # NOT pinned
django==1.* # NOT pinned
Raises TypeError if the input is not a valid InstallRequirement, or ValueError if the InstallRequirement is not pinned.
-
requirementslib.models.utils.
get_pyproject
(path)[source]¶ Given a base path, look for the corresponding
pyproject.toml
file and return its build_requires and build_backend.Parameters: path (AnyStr) – The root path of the project, should be a directory (will be truncated) Returns: A 2 tuple of build requirements and the build backend Return type: Optional[Tuple[List[AnyStr], AnyStr]]
-
requirementslib.models.utils.
get_url_name
(url)[source]¶ Given a url, derive an appropriate name to use in a pipfile.
Parameters: url (str) – A url to derive a string from Returns: The name of the corresponding pipfile entry Return type: Text
-
requirementslib.models.utils.
is_pinned_requirement
(ireq)[source]¶ Returns whether an InstallRequirement is a “pinned” requirement.
An InstallRequirement is considered pinned if:
- it is not editable
- it has exactly one specifier
- that specifier is “==”
- the version does not contain a wildcard
Examples:
django==1.8 # pinned
django>1.8 # NOT pinned
django~=1.8 # NOT pinned
django==1.* # NOT pinned
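The four rules above can be sketched directly. This minimal check operates on an (editable, specifiers) pair rather than a real InstallRequirement; the tuple-based interface is illustrative only:

```python
# Hypothetical sketch of the pinning rules: not editable, exactly one
# specifier, the operator is "==", and the version has no wildcard.
def is_pinned_sketch(editable, specifiers):
    if editable or len(specifiers) != 1:
        return False
    op, version = specifiers[0]
    return op == "==" and "*" not in version
```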
-
requirementslib.models.utils.
key_from_ireq
(ireq)[source]¶ Get a standardized key for an InstallRequirement.
-
requirementslib.models.utils.
key_from_req
(req)[source]¶ Get an all-lowercase version of the requirement’s name.
-
requirementslib.models.utils.
lookup_table
(values, key=None, keyval=None, unique=False, use_lists=False)[source]¶ Builds a dict-based lookup table (index) elegantly.
Supports building normal and unique lookup tables. For example:
>>> assert lookup_table(
...     ['foo', 'bar', 'baz', 'qux', 'quux'], lambda s: s[0]) == {
...     'b': {'bar', 'baz'},
...     'f': {'foo'},
...     'q': {'quux', 'qux'}
... }
For key functions that uniquely identify values, set unique=True:
>>> assert lookup_table(
...     ['foo', 'bar', 'baz', 'qux', 'quux'], lambda s: s[0], unique=True) == {
...     'b': 'baz',
...     'f': 'foo',
...     'q': 'quux'
... }
The values of the resulting lookup table will be values, not sets.
For extra power, you can even change the values while building up the LUT. To do so, use the keyval function instead of the key arg:
>>> assert lookup_table(
...     ['foo', 'bar', 'baz', 'qux', 'quux'],
...     keyval=lambda s: (s[0], s[1:])) == {
...     'b': {'ar', 'az'},
...     'f': {'oo'},
...     'q': {'uux', 'ux'}
... }
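The set-valued and unique variants shown above can be reproduced with a short sketch (simplified: it covers only the key function and unique flag, not keyval or use_lists):

```python
# Hypothetical simplified lookup_table: set-valued groups by default,
# last-wins single values when unique=True.
def lookup_table_sketch(values, key, unique=False):
    table = {}
    for value in values:
        k = key(value)
        if unique:
            table[k] = value
        else:
            table.setdefault(k, set()).add(value)
    return table
```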
-
requirementslib.models.utils.
make_install_requirement
(name, version=None, extras=None, markers=None, constraint=False)[source]¶ Generates an
InstallRequirement
. Create an InstallRequirement from the supplied metadata.
Parameters: - name (str) – The requirement’s name.
- version (str) – The requirement version (must be pinned).
- extras (list[str]) – The desired extras.
- markers (str) – The desired markers, without a preceding semicolon.
- constraint – Whether to flag the requirement as a constraint, defaults to False.
- constraint – bool, optional
Returns: A generated InstallRequirement
Return type: InstallRequirement
-
requirementslib.models.utils.
normalize_name
(pkg)[source]¶ Given a package name, return its normalized, non-canonicalized form.
Parameters: pkg (AnyStr) – The name of a package Returns: A normalized package name Return type: AnyStr
-
requirementslib.models.utils.
parse_extras
(extras_str)[source]¶ Turn a string of extras into a parsed extras list
Parameters: extras_str (str) – An extras string Returns: A sorted list of extras Return type: List[str]
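A hedged sketch of this parsing step, assuming bracketed input such as "[security,socks]"; the real parse_extras relies on packaging's requirement grammar, while this regex-based stand-in is illustrative only:

```python
import re

# Hypothetical sketch: pull extras out of a "[a, b]" string and return
# them as a sorted list; anything unbracketed yields an empty list.
def parse_extras_sketch(extras_str):
    match = re.match(r"\[(.*)\]", extras_str.strip())
    if not match:
        return []
    return sorted(e.strip() for e in match.group(1).split(",") if e.strip())
```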
-
requirementslib.models.utils.
read_source
(path, encoding='utf-8')[source]¶ Read a source file and get the contents with proper encoding for Python 2/3.
Parameters: - path (AnyStr) – the file path
- encoding (AnyStr) – the encoding that defaults to UTF-8
Returns: The contents of the source file
Return type: AnyStr
-
requirementslib.models.utils.
specs_to_string
(specs)[source]¶ Turn a list of specifier tuples into a string
Parameters: specs (List[Union[Specifier, str]]) – a list of specifiers to format Returns: A string of specifiers Return type: str
-
requirementslib.models.utils.
split_ref_from_uri
(uri)[source]¶ Given a path or URI, check for a ref and split it from the path if it is present, returning a tuple of the original input and the ref or None.
Parameters: uri (AnyStr) – The path or URI to split Returns: A 2-tuple of the path or URI and the ref Return type: Tuple[AnyStr, Optional[AnyStr]]
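The ref-splitting behavior can be sketched with a simple heuristic: only an "@" in the last path segment counts as a ref, so the "@" in "user@host" credentials is left alone. This heuristic is an assumption, not the library's exact logic:

```python
# Hypothetical sketch: split a trailing "@ref" off a path or URI.
def split_ref_sketch(uri):
    last_segment = uri.rsplit("/", 1)[-1]  # only look past the final slash
    if "@" in last_segment:
        stripped, _, ref = uri.rpartition("@")
        return stripped, ref
    return uri, None
```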
-
requirementslib.models.utils.
split_vcs_method_from_uri
(uri)[source]¶ Split a vcs+uri formatted uri into (vcs, uri)
-
requirementslib.models.utils.
strip_extras_markers_from_requirement
(req)[source]¶ Strips extras markers from requirement instances.
Given a
Requirement
instance with markers defining extra == ‘name’, strip out the extras from the markers and return the cleaned requirementParameters: req (PackagingRequirement) – A packaging requirement to clean Returns: A cleaned requirement Return type: PackagingRequirement
requirementslib.models.url module¶
-
class
requirementslib.models.url.
URI
(host: str, scheme: str = 'https', port: int = None, path: str = '', query: str = '', fragment: str = '', subdirectory: str = '', ref: str = '', username: str = '', password: str = '', query_dict: orderedmultidict.orderedmultidict.omdict = NOTHING, name: str = '', extras: tuple = NOTHING, is_direct_url: bool = False, is_implicit_ssh: bool = False, auth: str = None, fragment_dict: dict = NOTHING, username_is_quoted: bool = False, password_is_quoted: bool = False)[source]¶ Bases:
object
-
as_link
¶
-
bare_url
¶
-
base_url
¶
-
extras
= None¶ Any extras requested from the requirement
-
fragment
= None¶ URL Fragments, e.g. #fragment=value
-
full_url
¶
-
host
= None¶ The target hostname, e.g. amazon.com
-
is_direct_url
= None¶ Whether the url was parsed as a direct pep508-style URL
-
is_file_url
¶
-
is_implicit_ssh
= None¶ Whether the url was an implicit git+ssh url (passed as git+git@)
-
is_installable
¶
-
is_vcs
¶
-
name
= None¶ The name of the specified package in case it is a VCS URI with an egg fragment
-
name_with_extras
¶
-
password
= None¶ Password parsed from user:password@hostname
-
path
= None¶ The url path, e.g. /path/to/endpoint
-
port
= None¶ The numeric port of the url if specified
-
query
= None¶ Query parameters, e.g. ?variable=value…
-
query_dict
= None¶ An orderedmultidict representing query fragments
-
ref
= None¶ VCS ref this URI points at, if available
-
safe_string
¶
-
scheme
= None¶ The URI Scheme, e.g. salesforce
-
secret
¶
-
subdirectory
= None¶ Subdirectory fragment, e.g. &subdirectory=blah…
-
to_string
(escape_password=True, unquote=True, direct=None, strip_ssh=False, strip_ref=False, strip_name=False, strip_subdir=False)[source]¶ Converts the current URI to a string, unquoting or escaping the password as needed.
Parameters: - escape_password – Whether to replace password with
----
, default True - escape_password – bool, optional
- unquote – Whether to unquote url-escapes in the password, default True
- unquote – bool, optional
- direct (bool) – Whether to format as a direct URL
- strip_ssh (bool) – Whether to strip the SSH scheme from the url (git only)
- strip_ref (bool) – Whether to drop the VCS ref (if present)
- strip_name (bool) – Whether to drop the name and extras (if present)
- strip_subdir (bool) – Whether to drop the subdirectory (if present)
Returns: The reconstructed string representing the URI
Return type: Text
-
unsafe_string
¶
-
uri_escape
¶
-
url_without_fragment
¶
-
url_without_fragment_or_ref
¶
-
url_without_ref
¶
-
username
= None¶ The username if provided, parsed from user:password@hostname
-
requirementslib.utils module¶
-
exception
requirementslib.utils.
PathAccessError
(exc, seg, path)[source]¶ Bases:
KeyError
,IndexError
,TypeError
An amalgamation of KeyError, IndexError, and TypeError, representing what can occur when looking up a path in a nested object.
-
requirementslib.utils.
get_path
(root, path, default=<object object>)[source]¶ Retrieve a value from a nested object via a tuple representing the lookup path.
>>> root = {'a': {'b': {'c': [[1], [2], [3]]}}}
>>> get_path(root, ('a', 'b', 'c', 2, 0))
3
The path format is intentionally consistent with that of remap(). One of get_path’s chief aims is improved error messaging. EAFP is great, but the error messages are not. For instance, root['a']['b']['c'][2][1] gives back IndexError: list index out of range. What went out of range where? get_path currently raises PathAccessError: could not access 2 from path ('a', 'b', 'c', 2, 1), got error: IndexError('list index out of range',), a subclass of IndexError and KeyError. You can also pass a default that covers the entire operation, should the lookup fail at any level.
Args:
- root: The target nesting of dictionaries, lists, or other objects supporting __getitem__.
- path (tuple): A list of strings and integers to be successively looked up within root.
- default: The value to be returned should any PathAccessError exceptions be raised.
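The lookup-by-tuple behavior, including the whole-operation default, can be sketched briefly (simplified relative to the real PathAccessError, which also records the failing segment and path):

```python
# Hypothetical minimal get_path: walk the path with __getitem__, and fall
# back to the default (if one was given) on any lookup failure.
_SENTINEL = object()

def get_path_sketch(root, path, default=_SENTINEL):
    current = root
    for seg in path:
        try:
            current = current[seg]
        except (KeyError, IndexError, TypeError):
            if default is not _SENTINEL:
                return default
            raise
    return current
```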
-
requirementslib.utils.
is_installable_file
(path)[source]¶ Determine if a path can potentially be installed
-
requirementslib.utils.
is_vcs
(pipfile_entry)[source]¶ Determine if dictionary entry from Pipfile is for a vcs dependency.
-
requirementslib.utils.
remap
(root, visit=<function default_visit>, enter=<function dict_path_enter>, exit=<function dict_path_exit>, **kwargs)[source]¶ The remap (“recursive map”) function is used to traverse and transform nested structures. Lists, tuples, sets, and dictionaries are just a few of the data structures nested into heterogeneous tree-like structures that are so common in programming. Unfortunately, Python’s built-in ways to manipulate collections are almost all flat. List comprehensions may be fast and succinct, but they do not recurse, making it tedious to apply quick changes or complex transforms to real-world data. remap goes where list comprehensions cannot.
Here’s an example of removing all Nones from some data:
>>> from pprint import pprint
>>> reviews = {'Star Trek': {'TNG': 10, 'DS9': 8.5, 'ENT': None},
...            'Babylon 5': 6, 'Dr. Who': None}
>>> pprint(remap(reviews, lambda p, k, v: v is not None))
{'Babylon 5': 6, 'Star Trek': {'DS9': 8.5, 'TNG': 10}}
Notice how both Nones have been removed despite the nesting in the dictionary. Not bad for a one-liner, and that’s just the beginning. See `this remap cookbook`_ for more delicious recipes.
.. _this remap cookbook: http://sedimental.org/remap.html
remap takes four main arguments: the object to traverse and three optional callables which determine how the remapped object will be created.
Args:
- root: The target object to traverse. By default, remap supports iterables like list, tuple, dict, and set, but any object traversable by enter will work.
- visit (callable): This function is called on every item in root. It must accept three positional arguments, path, key, and value. path is simply a tuple of parents’ keys. visit should return the new key-value pair. It may also return True as shorthand to keep the old item unmodified, or False to drop the item from the new structure. visit is called after enter, on the new parent. The visit function is called for every item in root, including duplicate items. For traversable values, it is called on the new parent object, after all its children have been visited. The default visit behavior simply returns the key-value pair unmodified.
- enter (callable): This function controls which items in root are traversed. It accepts the same arguments as visit: the path, the key, and the value of the current item. It returns a pair of the blank new parent, and an iterator over the items which should be visited. If False is returned instead of an iterator, the value will not be traversed. The enter function is only called once per unique value. The default enter behavior supports mappings, sequences, and sets. Strings and all other iterables will not be traversed.
- exit (callable): This function determines how to handle items once they have been visited. It gets the same three arguments as the other functions – path, key, value – plus two more: the blank new parent object returned from enter, and a list of the new items, as remapped by visit. Like enter, the exit function is only called once per unique value. The default exit behavior is to simply add all new items to the new parent, e.g., using list.extend() and dict.update() to add to the new parent. Immutable objects, such as a tuple or namedtuple, must be recreated from scratch, but use the same type as the new parent passed back from the enter function.
- reraise_visit (bool): A pragmatic convenience for the visit callable. When set to False, remap ignores any errors raised by the visit callback. Items causing exceptions are kept. See examples for more details.
remap is designed to cover the majority of cases with just the visit callable. While passing in multiple callables is very empowering, remap is designed so very few cases should require passing more than one function. When passing enter and exit, it’s common and easiest to build on the default behavior. Simply add from boltons.iterutils import default_enter (or default_exit), and have your enter/exit function call the default behavior before or after your custom logic. See `this example`_. Duplicate and self-referential objects (aka reference loops) are automatically handled internally, `as shown here`_.
.. _this example: http://sedimental.org/remap.html#sort_all_lists
.. _as shown here: http://sedimental.org/remap.html#corner_cases