Python requirements.txt: restrict a dependency to be installed only on Atom processors
Question
I'm using TensorFlow inside an x86_64 environment, but the processor is an Intel Atom. This processor lacks the AVX extension, and since the pre-built wheels for TensorFlow are compiled with AVX, TensorFlow does not work and exits. Hence I had to build my own wheel, which I host on GitHub as a release file.
The problem I have is to download this pre-built wheel only on an Atom-based processor. I was able to achieve this previously using a setup.py file, where this can be easily detected, but I have migrated to pyproject.toml, which is very poor when it comes to customization and scripted installation support.
Is there anything similar, in addition to platform_machine=='x86_64', which checks for the processor type? Or has the migration to pyproject.toml killed my flexibility here?
The current requirements.txt is:
confluent-kafka @ https://github.com/HandsFreeGadgets/python-wheels/releases/download/v0.1/confluent_kafka-1.9.2-cp38-cp38-linux_aarch64.whl ; platform_machine=='aarch64'
tensorflow @ https://github.com/HandsFreeGadgets/python-wheels/releases/download/v0.1/tensorflow-2.8.4-cp38-cp38-linux_aarch64.whl ; platform_machine=='aarch64'
tensorflow-addons @ https://github.com/HandsFreeGadgets/python-wheels/releases/download/v0.1/tensorflow_addons-0.17.1-cp38-cp38-linux_aarch64.whl ; platform_machine=='aarch64'
tensorflow-text @ https://github.com/HandsFreeGadgets/python-wheels/releases/download/v0.1/tensorflow_text-2.8.2-cp38-cp38-linux_aarch64.whl ; platform_machine=='aarch64'
rasa==3.4.2
SQLAlchemy==1.4.45
phonetics==1.0.5
de-core-news-md @ https://github.com/explosion/spacy-models/releases/download/de_core_news_md-3.4.0/de_core_news_md-3.4.0-py3-none-any.whl
For platform_machine=='aarch64' I need something similar for x86_64, but only executed in Atom processor environments.
The old setup.py was:
import platform
import subprocess
import os

from setuptools import setup


def get_requirements():
    requirements = []
    if platform.machine() == 'x86_64':
        command = "cat /proc/cpuinfo"
        all_info = subprocess.check_output(command, shell=True).strip()
        # AVX extension is the missing important information
        if b'avx' not in all_info or ("NO_AVX" in os.environ and os.environ['NO_AVX']):
            requirements.append('tensorflow @ file://localhost/' + os.getcwd() + '/pip-wheels/amd64/tensorflow-2.3.2-cp38-cp38-linux_x86_64.whl')
    elif platform.machine() == 'aarch64':
        ...
    requirements.append('rasa==3.3.3')
    requirements.append('SQLAlchemy==1.4.45')
    requirements.append('phonetics==1.0.5')
    requirements.append('de-core-news-md @ https://github.com/explosion/spacy-models/releases/download/de_core_news_md-3.4.0/de_core_news_md-3.4.0-py3-none-any.whl')
    return requirements


setup(
    ...
    install_requires=get_requirements(),
    ...
)
The line if b'avx' not in all_info or ("NO_AVX" in os.environ and os.environ['NO_AVX']) does the necessary differentiation.
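For reference, the same differentiation also works as a standalone check outside of setup.py; a minimal sketch, assuming a Linux host where /proc/cpuinfo is readable (the function name is made up):

import os
import platform


def needs_custom_wheel() -> bool:
    """Return True when the non-AVX TensorFlow wheel should be installed."""
    if platform.machine() != 'x86_64':
        return False
    # The NO_AVX environment variable forces the custom wheel, as in the setup.py above.
    if os.environ.get('NO_AVX'):
        return True
    with open('/proc/cpuinfo', 'rb') as cpuinfo:
        return b'avx' not in cpuinfo.read()


if __name__ == '__main__':
    print(needs_custom_wheel())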
If a pyproject.toml approach is not right for my needs, what is recommended for Python with more installation power that is not marked as legacy? Maybe there is something for Python, similar to what Gradle is for building projects in the Java world (which was introduced to overcome the limitations of XML and provides a complete scripting language), that I'm not aware of?
Answer 1
Score: 2
My recommendation would be to migrate to pyproject.toml as intended. I would declare dependencies such as tensorflow according to the standard specification for dependencies, but I would not use any direct references at all.
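A minimal sketch of what such a declaration could look like, assuming a standard [project] table in pyproject.toml (the project name is hypothetical; the pins are taken from the requirements.txt in the question):

[project]
name = "my-application"  # hypothetical name
version = "0.1.0"
dependencies = [
    "tensorflow",          # no direct reference here; the custom wheel URL stays in a requirements file
    "rasa==3.4.2",
    "SQLAlchemy==1.4.45",
    "phonetics==1.0.5",
]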
Then I would create some requirements.txt files in which I would list the dependencies that need special treatment (no need to list all dependencies), for example those that require a direct reference (and/or a pinned version). I would probably create one requirements file per platform, for example a requirements-atom.txt.
As far as I know it should be possible to instruct pip to install from a remote requirements file via its URL. Something like this:
python -m pip install --requirement 'https://server.tld/path/requirements-atom.txt'
If you need to create multiple requirements.txt files with common parts, then probably a tool like pip-tools can help. Maybe something like the following (untested):
requirements-common.in
# Application (or main project)
MyApplication @ git+https://github.com/HandsFreeGadgets/MyApplication.git
# Common dependencies
CommonLibrary
AnotherCommonLibrary==1.2.3
requirements-atom.in:
--requirement requirements-common.in
# Atom CPU specific
tensorflow @ https://github.com/HandsFreeGadgets/tensorflow-atom/releases/download/v0.1/tensorflow-2.8.4-cp38-cp38-linux_aarch64.whl ; platform_machine=='aarch64'
pip-compile requirements-atom.in > requirements-atom.txt
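On the target machine the compiled file could then be installed in one step (untested); since requirements-common.in already references the application itself, this pulls in everything:

python -m pip install --requirement requirements-atom.txt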