logging_strict

joined 1 year ago
[–] logging_strict@programming.dev 1 points 1 day ago (1 children)

ok fine, let's talk about this Linux distro

I don't want to be a package-manager database in my off hours. Why is having users manage every transitive dependency a good design?

I'm asking because I really don't understand the merits of adopting this heavy burden

[–] logging_strict@programming.dev 1 points 1 day ago (2 children)

thank you for the history lesson

trio and anyio fix that

One phrase not found in the article: colored functions

[–] logging_strict@programming.dev 2 points 2 days ago (1 children)

This is a Linux post. Has nothing to do with Python

[–] logging_strict@programming.dev 1 points 3 days ago* (last edited 3 days ago)

Free threading came on the scene and packages slowly added support. So there is a will to gravitate toward and adopt what works, albeit gradually.

I prefer typing_extensions over typing and collections.abc

With typing_extensions, new features are always backported. With stdlib typing features, you have to continuously upgrade Python, and whatever you upgrade to is already guaranteed to be very temporary. It's much easier to upgrade a package.
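A quick sketch of the difference, using `override` as the example feature (stdlib only in Python 3.12+, backported by typing_extensions to older interpreters):

```python
try:
    from typing_extensions import override  # backported, works on older Pythons
except ImportError:
    from typing import override  # stdlib only on Python 3.12+

class Base:
    def greet(self) -> str:
        return "base"

class Child(Base):
    @override  # a type checker errors here if Base has no greet()
    def greet(self) -> str:
        return "child"

print(Child().greet())
```

The try/except import is the usual pattern: prefer the backport, fall back to stdlib.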

For the same reasoning I would prefer Trio over asyncio.TaskGroup.

Which leads to the question: Trio vs asyncio.TaskGroup?

asyncio.TaskGroup is a py311 feature, with the context kwarg added in 3.13. The documentation is very terse and I'm unsure what guarantees it has, besides "strong". Missed opportunity; could have used the adjective Mickey Mouse instead. Both adjectives are essentially the same: useless.

Having to upgrade to 3.13 is what I call failure to backport, or simply failure, or that's what failure looks like.

Give a free pass to free threading, but everything else, no!

Having to upgrade Python to have access to sane structured concurrency is silly. I have the exact same complaints about package managers.

What to know about f-strings

!r is a thing

!s is a thing

There is some syntax for formatting a float which will be completely forgotten and will have to be looked up.

There is nothing else worth knowing.
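Those three items in one runnable snippet:

```python
value = "spam"
pi = 3.14159265

print(f"{value!s}")  # !s applies str() -> spam
print(f"{value!r}")  # !r applies repr() -> 'spam' (quotes included)
print(f"{pi:.2f}")   # the float syntax everyone forgets: 2 decimal places -> 3.14
```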

Now let's moan and complain about something actually important, like repos with languishing PRs, like SQLModel.

[–] logging_strict@programming.dev 1 points 1 week ago* (last edited 1 week ago)

Upvote for the sanity check.

As the OP mentioned, this is a proposed/draft feature that may or may not ever happen.

With these kinds of posts, we should start a betting pool: put money down on whether the feature sees the light of day within an agreed-upon fixed time frame.

[–] logging_strict@programming.dev 3 points 2 weeks ago* (last edited 2 weeks ago)

Why the commercial license for pngquant? Maybe rewriting pngcrush IP and slapping a commercial license on it is copyright infringement. This is my impression of Rust: take others' IP, rewrite it in Rust, and poof, the copyright magically transfers. How much of the C99 version is from prior art?

Let's just ignore prior art and the associated license terms

pngquant commercial license

written by Kornel Lesiński

ImageOptim Ltd. registered in England and Wales under company number 10288649 whose registered office is at International House, 142 Cromwell Road, London, England, SW7 4EF

First commit Sep 17th, 2009

pngcrush license

Copyright (C) 1998-2002, 2006-2016 Glenn Randers-Pehrson

glennrp at users.sf.net

Portions copyright (C) 2005 Greg Roelofs

 

Python static typing Ludwick hunger games

[–] logging_strict@programming.dev -3 points 3 weeks ago* (last edited 3 weeks ago)

i'm a fan of ladies with complete test coverage

but i'm ok with those who are a fan of type inference.

More ladies for me

[–] logging_strict@programming.dev 0 points 3 weeks ago* (last edited 3 weeks ago)

Oh btw there are three choices, not two.

  1. stdlib dataclasses

  2. pydantic dataclasses

  3. SQLModel mixin
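A minimal sketch of the first two options (option 3, the SQLModel mixin, also pulls in SQLAlchemy, so it is left out here); the class and field names are made up:

```python
from dataclasses import dataclass

# Option 1: stdlib dataclasses -- generated __init__/__repr__/__eq__,
# but no validation or coercion of field values.
@dataclass
class UserStd:
    name: str
    age: int

# Option 2: pydantic dataclasses -- same decorator shape, but validates
# and coerces at construction time (requires the pydantic package):
# from pydantic.dataclasses import dataclass as pydantic_dataclass

print(UserStd(name="alice", age=42))
```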

How to use Mixins

[–] logging_strict@programming.dev -3 points 4 weeks ago* (last edited 4 weeks ago)

pydantic underneath (pydantic-core) is written in Rust. fastapi is fast because of pydantic, and fastapi is extremely popular because it's right there in the name: fast. No matter how poor the usability is, the word fast will always win.

If fastapi is fast, then whatever is not fastapi is slow. And there is no convincing anyone otherwise, so let's not even try.

Therefore let's do fast, because we already agreed slow would be bad.

Normal dataclasses are not fast and therefore bad. If they had a better marketing team, this would be a different conversation.

SQLModel combines pydantic and SQLAlchemy.

At first I fell in love with the SQLModel docs. Then I realized the eye-wateringly beautiful docs are missing vital details, such as how to:

  1. create a Base without also creating a Model table

  2. override the __tablename__ algorithm, whose awful default is cls.__name__.lower()

  3. support multiple databases each containing the same named table

#2 is particularly nasty. The SQLModel.__new__ implementation consists of multiple metaclasses, so subclasses always inherit that worthless __tablename__ implementation. And SQLAlchemy applies three decorators, so figuring out the right witchcraft to create the descriptor is near impossible. pydantic doesn't support overriding __tablename__.

Then i came along

After days of, let's be honest, hair loss and bouts of heavy drinking, I posted the answer here.

Required familiarity with pydantic, sqlalchemy, and SQLModel.

[–] logging_strict@programming.dev 1 points 1 month ago (1 children)

There is an expression: Linux isn't free, it costs you your time. Which might be a counter-argument against always using only what is built in.

I'm super guilty of reinventing the wheel. But writing overly verbose code isn't fun either; I never seem to get very far.

[–] logging_strict@programming.dev 2 points 1 month ago* (last edited 1 month ago) (3 children)

people are forced to install dependencies

This ^^.

If possible, Python dependency management is a burden I would prefer to avoid. Until I can't; then be skilled at it!

disclosure: i use/wrote wreck for Python dependency management.

Compiled languages should really live within containers. At all costs, I would like to avoid time-consuming system updates! I can no longer install C programs because the OS partition ran out of hard disk space, whereas Python packages can be installed on data-storage partitions.

for Python, I usually deliver the script as a single .py file

I'm sure you are already aware of this, so forgive me if this is just being Captain Obvious.

Even if the deliverable is a single .py file, there is support for specifying dependencies within a module-level comment block (I forget the PEP #).
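For reference, that comment block (inline script metadata, standardized by PEP 723) looks roughly like this; `requests` is just a placeholder dependency:

```python
# /// script
# requires-python = ">=3.11"
# dependencies = [
#     "requests",
# ]
# ///
# Runners such as `pipx run` and `uv run` parse the block above and build
# a throwaway environment with the listed dependencies before executing.

GREETING = "hello from a single-file deliverable"
print(GREETING)
```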

I don’t like that (unless its a shell script, but that is by its nature a dependency hell)

You and I could bond over a hatefest on shell scripts, but let's leave that outside the discussion scope.

And to your argument: as the complexity of a .py script grows, it very quickly reaches the point where the deliverable becomes a Python package. The exceptions are projects which are: external-language, low-level, or simple. This .py-script approach does not scale and is exceedingly rare to encounter; it may be an indication of an old, dated, or unmaintained project.

From a random venv, installed scripts:

_black_version.py
appdirs.py
cfgv.py
distutils-precedence.pth
mccabe.py
mypy_extensions.py
nodeenv.py
packaging_legacy_version.py
pip_requirements_parser.py
py.py
pycodestyle.py
pyi.py
six.py
typing_extensions.py
2
submitted 5 months ago* (last edited 5 months ago) by logging_strict@programming.dev to c/python@programming.dev
 

Market research

This post is only about dependency management, not package management, not build backends.

You know about these:

  • uv

  • poetry

  • pipenv

You are probably not familiar with:

  • pip-compile-multi

    (toposort, pip-tools)

You are definitely unfamiliar with:

  • wreck

    (pip-tools, pip-requirements-parser)

pip-compile-multi creates lock files. Has no concept of unlock files.

wreck produces both lock and unlock files. venv aware.

Both sync dependencies across requirement files

Both act only upon requirements files, not venv(s)

Up to speed with wreck

You are familiar with .in and .txt requirements files.

.txt is split out into .lock and .unlock. The latter is for packages which are not apps.

Create .in files that are interlinked with -r and -c. No editable builds. No urls.

(If this is a deal breaker feel free to submit a PR)

pins files

pins-*.in files are for common constraints. The huge advantage here is documenting why.

Without the documentation, even the devs have no idea whether or not the constraint is still required.

Each pins-*.in file is split out to tackle one issue. The beauty is that the issue must be documented with enough detail to bring yourself back up to speed.

Explain the origin of the issue in terms a six-year-old can understand.
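A hypothetical pins file following this convention; the package name and issue are invented for illustration:

```text
# requirements/pins-somepkg.in  (hypothetical example)
# Why: somepkg 2.0 changed its plugin API; our-plugin has not caught up yet.
# Origin: link the upstream tracking issue here.
# Remove when: our-plugin publishes a 2.x-compatible release.
somepkg<2.0
```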

Configuration

python -m pip install wreck

This is logging-strict pyproject.toml


[tool.wreck]
create_pins_unlock = false

[[tool.wreck.venvs]]
venv_base_path = '.venv'
reqs = [
    'requirements/dev',
    'requirements/kit',
    'requirements/pip',
    'requirements/pip-tools',
    'requirements/prod',
    'requirements/manage',
    'requirements/mypy',
    'requirements/tox',
]

[[tool.wreck.venvs]]
venv_base_path = '.doc/.venv'
reqs = [
    'docs/requirements',
]

dynamic = [
    "optional-dependencies",
    "dependencies",
    "version",
]

[tool.setuptools.dynamic]
dependencies = { file = ["requirements/prod.unlock"] }
optional-dependencies.pip = { file = ["requirements/pip.lock"] }
optional-dependencies.pip_tools = { file = ["requirements/pip-tools.lock"] }
optional-dependencies.dev = { file = ["requirements/dev.lock"] }
optional-dependencies.manage = { file = ["requirements/manage.lock"] }
optional-dependencies.docs = { file = ["docs/requirements.lock"] }
version = {attr = "logging_strict._version.__version__"}

Look how short and simple that is.

The only thing you have to unlearn is being so timid.

More venvs mean more constraints and requirements complexity.

Do it

mkdir -p .venv || :;
pyenv version > .venv/python-version
python -m venv .venv

mkdir -p .doc || :;
echo "3.10.14" > .doc/python-version
cd .doc && python -m venv .venv; cd - &>/dev/null

. .venv/bin/activate
# python -m pip install wreck
reqs fix --venv-relpath='.venv'

There will be no avoidable resolution conflicts.

Preferable to do this within tox-reqs.ini

Details

The TOML file format expects the paths to be single-quoted (literal strings). The paths are relative, without the final file suffix.

If pyproject.toml is not in the cwd, pass --path='path to pyproject.toml'

create_pins_unlock = false tells wreck to not produce .unlock files for pins-*.in files.

DANGER

This is not for the faint of heart. Avoid it if you can. This is for the folks who often say, Oh really? Hold my beer!

For pins that span venvs, add the file suffix .shared

e.g. pins-typing.shared.in

wreck deals with one venv at a time. Files that span venvs have to be dealt with manually and carefully.

Issues

  1. no support for editable builds

  2. no url support

  3. no hashes

  4. your eyes will tire and your brain will splatter on the wall from all the eye rolling, after sifting thru endless posts on uv and poetry and none about pip-compile-multi or wreck

  5. Some folks love having all dependencies managed within pyproject.toml. These folks are deranged and it's impossible to convince them otherwise. pyproject.toml is a config file, not a database. It should be read-only.

  6. a docs link on pypi.org is a 404. Luckily there are two docs links. Should really just fix that, but it's left like that to see if anyone notices. No one has.

4
submitted 5 months ago* (last edited 5 months ago) by logging_strict@programming.dev to c/python@programming.dev
 

Finally got around to creating a gh profile page

The design is to give activity insights on:

  • what Issues/PRs working on

  • future issues/PRs

  • for fun, show off package mascots

All out of ideas. Any suggestions? How did you improve your github profile?

13
Whats in a Python tarball (programming.dev)
submitted 6 months ago* (last edited 6 months ago) by logging_strict@programming.dev to c/python@programming.dev
 

From helping other projects, I have run across a fundamental issue for which web searches have not given appropriate answers.

What should go in a tarball and what should not?

Is it only the build files, Python code, and package data, and nothing else?

Should it include tests/ folder?

Should it include development and configuration files?

I have seven published packages which include almost all the files and folders, including:

.gitignore,

.gitattributes,

.github folder tree,

docs/,

tests/,

Makefile,

all config files,

all tox files,

pre-commit config file

My thinking is that the tarball should have everything needed to maintain the package, but this belief has been challenged: the tarball is not appropriate for that.

Thoughts?

 

PEP 735: what is its goal? Does it solve our dependency-hell issue?

A deep dive, and out comes this limitation:

The mutual compatibility of Dependency Groups is not guaranteed.

-- https://peps.python.org/pep-0735/#lockfile-generation

Huh?! Why not?

mutual compatibility or go pound sand!

pip install -r requirements/dev.lock
pip install -r requirements/kit.lock -r requirements/manage.lock

The above code, purposefully, does not afford pip a fighting chance. If there are incompatibilities, they'll come out when trying randomized combinations.

Without a means to test for and guarantee mutual compatibility, end users will always find themselves in dependency hell.

Any combination of requirement files (or dependency groups), intended for the same venv, MUST always work!

What if this is scaled further, instead of one package, a chain of packages?!

 

At the top of a requirements-*.in file are lines with -r and -c flags, each followed by the relative path to another requirements-*.in file (ignoring URLs).

Say we have docs/requirements-pip-tools.in

-r ../requirements/requirements-prod.in
-c ../requirements/requirements-pins-base.in
-c ../requirements/requirements-pins-cffi.in

...

The intent is that compiling this would produce docs/requirements-pip-tools.txt

But there is confusion as to which flag to use. It's non-obvious.

constraint

A subset of requirements features, intended to restrict package versions. It does not necessarily (might not) install the package!

Does not support:

  • editable mode (-e)

  • extras (e.g. coverage[toml])
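A hypothetical layout illustrating the rule of thumb: -r pulls packages in, while -c only caps versions:

```text
# docs/requirements.in  (hypothetical example)
-r ../requirements/prod.in        # -r: these packages get installed
-c ../requirements/pins-base.in   # -c: version caps only; installs nothing

sphinx                            # a docs-only requirement
```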

Personal preference

  • always organize requirements files in folder(s)

  • don't prefix requirements files with requirements- (only doing it here for clarity)

  • DRY principle applies; split out constraints which are shared.
