[DISCUSSION] use poetry to install all the python dependencies, including sphinx/docs

Semyon Sinchenko

Feb 21, 2024, 7:11:29 PM
to gra...@googlegroups.com
Hi all!

Inside graphar-pyspark there is a full poetry-based Python project. All
the documentation-related dependencies (including everything listed in
docs/requirements.txt) are collected in the poetry group "docs", so they
can be installed via poetry (poetry install --with=docs). There is also
an alternative command in docs/Makefile: make html-poetry. It works fine
for me in my local setup. I suggest switching our docs CI from pip
install to poetry.
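
For reference, the same setup can also be exercised directly from the
CLI; a minimal sketch (the docs group and the install command are taken
from the project, while poetry show with group options assumes a
reasonably recent poetry version):

  # install graphar-pyspark together with its documentation dependencies
  poetry install --with=docs
  # list what ended up in the environment, docs group included
  poetry show --with=docs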

The main reason is that Sphinx is a Python-native tool: to generate the
Python API docs, it has to be run from an environment in which the
Python project itself is installed. I have checked this and it works
fine in my local setup (the documentation was generated successfully for
all the parts, not only pyspark).
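
For context, what make html-poetry boils down to is roughly the
following; this is only a sketch of the idea, the exact source/build
paths and flags live in docs/Makefile and may differ:

  # run sphinx-build inside the poetry-managed environment of
  # graphar-pyspark, so that autodoc can import the installed package
  poetry run sphinx-build -b html . _build/html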

Could someone take a look or double-check that we are ready to switch
to poetry?

To reproduce it locally you need poetry installed
(https://python-poetry.org/docs/#installation); then run the following:

Inside graphar-pyspark:
1. poetry install --with=docs,spark

Inside docs:
1. make pyspark-apidoc
2. make html-poetry
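
As a quick sanity check after step 2, the generated site can be served
locally; the _build/html path below is an assumption based on the
default Sphinx layout, adjust it to whatever docs/Makefile actually
uses:

  # serve the generated HTML on http://localhost:8000
  python -m http.server --directory _build/html 8000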

If everything works fine for all the other projects (Java, Spark, C++),
I can create a PR that changes the docs CI.

Best regards,
Sem
