2024-05-13 15:59:34
Teaching #Python folks the joys of clear and explicit object type definitions. An interesting experience...
#Programming
This is one of those days when it occurs to you: "hey, packages using #Python and #RustLang may have *both* Python- and Cargo-level tests." And then you spend a lot of time going over all the Rust-enabled dev-python/* packages and adding `cargo_src_test` where appropriate.
As it turns out, many of them do. Most of these don't actually link to libpython, so I suppose it's fine to test them once. Pydantic-core does, so I test it per-impl (but also can't test it on PyPy). Cryptography has Rust-level tests that don't even build (they fail at linking).
#Gentoo
#CPython 3.13 has some new fun flags for #Gentoo ricers.
USE=jit − enable to get Just-in-Time compiler and make stuff faster. Note that you're also going to see random segfaults, etc.
USE=gil − disable to get rid of Global Interpreter Lock and get a freethreading #Python. Expect some random breakage, race conditions, etc.
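A quick runtime sanity check of which variant you've ended up with (a minimal sketch; it assumes that `Py_GIL_DISABLED` is set to 1 in free-threading builds and is 0 or absent otherwise):

```python
import sysconfig

# Py_GIL_DISABLED is 1 on free-threading (USE=-gil) builds,
# 0 on regular CPython 3.13 builds, and None on older versions
# where the config variable doesn't exist at all.
print(sysconfig.get_config_var("Py_GIL_DISABLED"))
```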
BeeTeeDubs, I'm going live in ten with a #LiveCoding #Python Tutorial drawing flags with turtle graphics.
Really it's about abstraction and decomposition.
#ComputerScience
I’m doing #python today...
RenPy is such an interesting project.
#programming #python
Breaking #Packages in #Python: an exposé of the nooks and crannies of Python’s modules and packages
https://dagster.io/bl…
Come for wisdom on tox & Nox – stay for unrelated gems that will improve your life! #python #video
https://youtu.be/ImBvrDvK-1U
#Python
As I'm working on my #networking diagnostic script, which relies on an OS abstraction layer for system information that I built a number of years back, I decided it would make more sense to formally publish this API on PyPI as a standalone library. (It doesn…
I think I have finally™️ (for the third or so time) found myself a solution for :python: #Python development on :nixos: #NixOS that allows me to just work with #pythonPoetry et al. as on other distros.
…Here's my newest optimization idea for #Gentoo #Python: for pure Python packages and packages using the stable #CPython API, let's reuse previously built wheels if they're compatible instead of building them separately for each Python implementation.
This is mostly a major gain for the many packages using #setuptools when you're using multiple PYTHON_TARGETS, since calling into that build system has a significant cost. However, it also saves some actual compiling in packages using the stable API (particularly Rust packages).
https://github.com/gentoo/gentoo/pull/36672
https://bugs.gentoo.org/931689
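The compatibility rule can be sketched roughly like this (a hypothetical helper of my own, not the actual eclass logic; the filenames and the `impl`/`version` parameters are my illustration):

```python
# Hypothetical helper (my own sketch, not the eclass code): decide from
# a wheel's tags whether a wheel built for one target could be reused
# for another Python implementation/version.
def can_reuse_wheel(wheel_name: str, impl: str, version: tuple) -> bool:
    # Wheel filename layout: name-ver(-build)?-pytag-abitag-platform.whl
    pytag, abitag, _plat = wheel_name[:-4].split("-")[-3:]
    if pytag == "py3" and abitag == "none":
        # Pure Python: fine for any implementation.
        return True
    if abitag == "abi3" and impl == "cp" and pytag.startswith("cp3"):
        # Stable ABI: usable on CPython >= the version it was built for
        # (assumes a 3.x tag such as cp39 or cp310).
        return version >= (3, int(pytag[3:]))
    return False

print(can_reuse_wheel("foo-1.0-py3-none-any.whl", "cp", (3, 12)))                # → True
print(can_reuse_wheel("foo-1.0-cp39-abi3-manylinux_x86_64.whl", "cp", (3, 12)))  # → True
print(can_reuse_wheel("foo-1.0-cp312-cp312-manylinux_x86_64.whl", "pp", (3, 10)))  # → False
```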
It's really a bummer that #Python enumerate doesn't have an option to tell it to go backwards.
Also, it took me way too long to figure out that code like this doesn't do what I wanted it to do. (The index doesn't match the location in MyString.)
MyString = "Hello"
for index, Char in enumerate(MyString[::-1]):
    print(index, Char)
0 o
1 …
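For the record, a couple of plain-stdlib ways to walk a string backwards while keeping indices that match the original string (my own sketches):

```python
MyString = "Hello"

# Option 1: compute the original index from the reversed position.
for index, char in enumerate(reversed(MyString)):
    print(len(MyString) - 1 - index, char)

# Option 2: zip a descending range with the reversed string.
for index, char in zip(range(len(MyString) - 1, -1, -1), reversed(MyString)):
    print(index, char)
```

Both print `4 o` first and `0 H` last, so the index always points at the character's position in the original string.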
Me: #Python couldn't possibly still have a horrid experience on installing packages.
Me, ten minutes later: Actually, nope, still horrid. How do people function in this environment?!?
h/t to @… for being correct here.
For the record,
How about tests that fail in a completely nondescript way if your hostname is 12 characters long? I mean, the bash prompt gets into output and breaks matching in one environment, and doesn't in an environment with a shorter hostname.
As it turns out, it was caused by a wrong path to the bashrc override, introduced while adding zsh support to #ArgComplete. I guess most people don't get a bash prompt long enough to hit the failure.
#Gentoo #Python
Time is hard #python
I'm close to having @… sorted out. With a new API comes slightly different data points. #python
I have noticed that some of my Python builds are emitting "Sorry" error messages rather than "Error" messages for things like indentation flaws.
"Sorry" is a sorry way to flag an error.
#python
It's been a while since I worked with #Python, but I remember how great Ruff was as a linter replacement.
It seems that the same team behind Ruff is now releasing a package manager named `uv`, as a substitute for pip, poetry, pdm… and it looks great (well, I think the name is terrible, but we can live with that):
Even though I despise Python for wasting CPU cycles, I have rarely seen a CPU-limited system that was actually busy doing CPU things.
Most of the time things are only slow because GPU things like rendering and video decoding have to be done on the CPU.
#Python #Programming
Wow, I just accidentally discovered that #matplotlib allows some limited latex in label strings. At least $x$ works!
#python
Video tutorials for modern ideas and open source tools. #python
Today I saw #Python code from hell, where the author wrote a function to wrap a call that might return an empty list. Apparently they wanted it to return False in that case, to use it in an if statement, not knowing that Python treats empty lists as falsy by default. But it gets better: the code was broken and was never able to return False. Instead it returned the empty list, but the code worked an…
Can one pre-build a buildFHSUserEnv in configuration.nix and then have a command like 'fhs' that yanks you into a shell with all your environment.systemPackages in an FHS layout?
All my tries have given me either infinite recursion errors or other problems.
Seems like a way out of the #Python development misery on
New on blog: "the story of #distutils build directory in #Gentoo".
"""
The #Python distutils build system, as well as #setuptools (that it was later merged into), used a two-stage build: first, a build command would prepare a built package version (usually just copy the .py files, sometimes compile Python extensions) into a build directory, then an install command would copy them to the live filesystem, or a staging directory. Curiously enough, distutils were an early adopter of out-of-source builds — when used right (which often enough wasn’t the case), no writes would occur in the source directory and all modifications would be done directly in the build directory.
Today, in the #PEP517 era, two-stage builds aren't really relevant anymore. Build systems were turned into black boxes that spew wheels. However, setuptools still internally uses the two-stage build and the build directory, and therefore it still remains relevant to Gentoo eclasses. In this post, I'd like to shortly tell how we dealt with it over the years.
"""
https://blogs.gentoo.org/mgorny/2024/03/13/the-story-of-distutils-build-directory-in-gentoo/
I just wrote a script in #python that uses HuggingFaceH4 (hosted internally on a server) to propose a severity for my bugs, with an explanation of why it chose it.
Took me about 3 hours including research and setup, and it's pretty good. For the first 20 bugs, I would personally rate a few differently, but for one of those the genAI is maybe more right than I am :)
Quoting myself (and translating):
"""
Honestly, I think the biggest problem is that software distribution in Python is endlessly complicated and unintuitive, which means that anyone who wants to get involved with it, from whichever side, will to their surprise discover a very high barrier to entry. #Gentoo
"""
This video is about #Python, but the principles apply to any language. Or with appropriate language specific translation.
https://youtu.be/wf-BqAjZb8M
I've done some benchmarks to demonstrate how different versions of the #Gentoo #PEP517 code in distutils-r1.eclass improved parallel #Python extension builds (using #setuptools) over the years. I've even copied the numbers into a Gnumeric spreadsheet. Now I just need to find the energy to make a nice chart out of it (probably using PGF/TikZ).
Some numbers for a rough idea:
Cython 3.0.9 (package with a few C extensions):
- serial PEP517 build: 46.7 s
- parallel build / build_ext, then PEP517: 20.8 s + 2.7 s
- parallel PEP517 via DIST_EXTRA_CONFIG: 22.8 s
django 5.0.3 (moderate pure .py package):
- PEP517 build: 5.4 s
- unnecessary build, then PEP517: 3.1 s + 5.3 s
- unnecessary build_ext, then PEP517: 0.6 s + 5.4 s
Second day of #PyDataBerlin, hearing about measuring tree height with satellite imagery
#python #PyData
I'm surprised to see a computer science topic become the talk of the White House, but here we are.
The memory safety features of #Python are one of the many things I like about it.
What are the memory safe features of Python? A few that spring to mind: automatic bounds-checking, can't use undefined object references, & no memory pointer arithmetic. IMHO, those features don'…
What would cause :git: #git to shell out at 'git -C repo fetch --all'?
I call git via #Python's subprocess module (no, not with shell=True) and it apparently shells out to my default :fish_shell: #fishShell
Katherine Jarmul at #pydataberlin: "Sure, humans learn things too, but we don't do it at scale, and surely we don't do it word by word"
THIS
#python #ml
On 2024-03-19, two vulnerabilities were announced on #Python #security mailing list: "quoted zip-bomb" and "TemporaryDirectory symlink dereference during cleanup". Both were announced to affect all current #CPython releases.
The same day, security releases were made for Python 3.10, 3.9 and 3.8 branches. So far, so good. However, I found it surprising that there were no releases being made for 3.11 or 3.12.
On 2024-04-02, Python 3.11.9 was tagged. Initially, the signature on source tarball didn't verify. Today, it does verify, but the release still doesn't seem to have been announced. However, what I found the most surprising is the lack of fixes for the security issues announced before! Was the release borked?
So I've checked in more detail… and it turned out that both issues were already fixed in 3.11.8 (and 3.12.2), so the security announcement was wrong. Sigh.
That said, #PyPy is still affected.
https://mail.python.org/archives/list/security-announce@python.org/thread/XELNUX2L3IOHBTFU7RQHCY6OUVEWZ2FG/
https://mail.python.org/archives/list/security-announce@python.org/thread/Q5C6ATFC67K53XFV4KE45325S7NS62LD/
https://discuss.python.org/t/python-3-10-14-3-9-19-and-3-8-19-is-now-available/48993
https://bugs.gentoo.org/927299
Here at #distribits unconference I quickly demonstrate @…'s #textual framework for website-like
Python Question: Is there an efficient way to see if every element in an iterable is equal to a user-specified value?
The best idea I've seen so far is to use a generator expression to check for equality, and pass it to the all function, like this:
MyList = [1, 1, 1, 1, 1, 1, 1]
all(x == 1 for x in MyList)
That's readable and probably not bad performance. Is there something better?
#Python
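A few stdlib options side by side, for comparison (my own sketches; relative performance will depend on the input):

```python
MyList = [1, 1, 1, 1, 1, 1, 1]

# all() with a generator expression short-circuits at the first mismatch.
print(all(x == 1 for x in MyList))       # → True

# count() makes a single C-level pass over the list (no short-circuit).
print(MyList.count(1) == len(MyList))    # → True

# A set comparison also works; note it treats an empty list as "all equal".
print(set(MyList) <= {1})                # → True
```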
Give a warm welcome to @… dear #Python folks! 🇮🇹
I remember that back in the day, I've hit a #Python package that would attempt to be closed-source by shipping only a .pyc file… for Python 2.5. As you can imagine, this didn't work out — though it was a bit of a hassle to get a working decompiler back then (we didn't have a Python version older than 2.7.x in #Gentoo at the time). However, I don't really remember what it was and how it worked out in the end.
Who will be at PyData Berlin/PyCon Germany next week? ✋🏼
#python #PyConDE2024 #PyDataBerlin
If you were ever wondering why #Gentoo python-exec is sitting in dev-lang/ rather than dev-python/, that's actually a curious story of a hack.
#Python script wrapping dates back to the original python.eclass. The idea was sound: you'd rename scripts installed by Python packages to "foo-2.6", "foo-3.1", and then install a new wrapper that selects one of these versions. If this may sound unnecessary, remember that scripts need to be run with a Python version that has all its dependencies installed, and that different scripts could technically be installed for different versions (especially in the early Python 3 era, when using 2to3 was not uncommon).
The first version of dev-python/python-exec built on that concept, optimizing it in a few ways. It would install a shared wrapper as a separate package and use symlinks, whereas the old wrappers would be installed separately by every package. It would introduce a C executable to remove the overhead of starting a Python interpreter to determine which interpreter to actually start. However, it did copy the idea of suffixing scripts.
In the long run, this turned out to be a bad idea. Some scripts expected specific names, and we ended up hacking and patching things around. So I made python-exec 2, which used a slightly different approach: rather than renaming scripts, it moved them into a per-interpreter directory. This not only preserved the basename but let us make things simpler — we could just override the script install directory rather than having to rename stuff!
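As a toy illustration of the dispatch idea (my own Python sketch; the real wrapper is a C executable, and the directory layout here is an assumption for illustration):

```python
import os

def resolve_script(name, prefix, preference):
    # Try each interpreter's script directory in preference order and
    # return the first match, preserving the script's original basename.
    for impl in preference:
        candidate = os.path.join(prefix, "lib", "python-exec", impl, name)
        if os.path.exists(candidate):
            return candidate
    return None
```

The wrapper just needs the script's own name to look it up, so nothing has to be renamed.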
Migration was the problem. The two different approaches meant that packages built against python-exec:0 would have to keep depending on it. Originally, I added a `:=` dep to the eclass, but this couldn't solve the problem for packages already installed, and Portage ended up removing the old python-exec. It wasn't good.
After some tinkering, I've figured out that the best approach is to add both slots as a new package, have the eclass depend on it with `:=` operator going forward and have the old `dev-python/python-exec` pull both slots for compatibility. This is what gave birth to `dev-lang/python-exec`.
https://bugs.gentoo.org/489440#c20
It's a miracle! I've managed to make #SciKit Image tests work offline in #Gentoo!
That said:
1. Upstream includes most of the test data in sdist. Except they don't use it, and instead fetch it all from the GitHub repository. Sigh. (If you're upstream, please pretend you didn't see this and do not remove the data from sdist. Thanks.)
2. On top of that, they fetch more test data from an additional GitLab repository, and move and rename it around.
3. Yes, you've read right. They keep code on GitHub, but test data on GitLab.com (not self-hosted).
#Python
You know what's great about #Python stable ABI? That you can take a binary package of, say, cryptography, and it will work on CPython 3.13, even though it's been built with older CPython version.
You know what's not so great about #PyO3? That you won't be able to build this package using Python 3.13 because it's going to reject it as "too new". Even if the package in question is only using the stable ABI compatible with CPython 3.9. Sigh.
So, of course, everything on #Gentoo will be blocked, until individual packages update their dependencies to use PyO3 new enough to support 3.13.
#RustLang
Today is the day! We have finally been able to mark #NoseTests for removal from #Gentoo!
#Python
The "#PyPI Support Specialist" job posting would be quite nice if they didn't add the last point that automatically disqualifies some people with disabilities, plus people from some countries that USA doesn't like at the moment [EDIT: and of course people who for one reason or another don't want to go to such a dangerous and hostile country]:
"Willingness to travel to annual PyCon US conference"
#ActuallyAutistic #Python
Recently I've added a cheap hack to the standard #Gentoo invocation for #PyTest to throw errors if unhandled async functions are detected. The goal was to increase our chances of finding packages with missing dependency on dev-python/pytest-asyncio (or another equivalent plugin), or packages disabling plugin autoloading and failing to load such a plugin.
Today, I've gotten a first bug report, regarding dev-python/ipython. I've grepped the sources and confirmed that the package depends on PyTest-AsyncIO, except that it pins to < 0.22. Well, we don't have one that old but let's hope it works anyway. So I've tried adding the dep, `-p asyncio`… and PyTest still apparently couldn't find the plugin. I've scratched my head and tried `PYTEST_PLUGINS` instead — still the same result. What the…?
So I've checked the git repository out, tried with older PyTest-AsyncIO, and indeed the tests worked. Tried with the newest, 0.23.6, and the same issue occurred. I've checked the git history and discovered that the version pin was added because of a buggy 0.22.0 release. However, the issue has been fixed since, the release was yanked and my problem was nothing like that.
So I've investigated more. For some reason, the #IPython test suite does not mark tests with the `pytest.mark.asyncio` marker directly. Instead, it globally iterates over all test functions and implicitly adds the marker to all coroutines. This used to work with older versions, but doesn't anymore: the test is correctly marked, but for some reason it stops being recognized as a coroutine. So I've made a minimal reproducer and filed a bug.
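The detection half of that pattern can be sketched like this (a simplified illustration of my own, not IPython's actual code; the test names are made up):

```python
import inspect

async def test_async_thing():
    pass

def test_sync_thing():
    pass

def find_coroutine_tests(funcs):
    # Mimics the collection-time hook: pick out the test functions that
    # are coroutines, so a marker could be added to them implicitly.
    return [f.__name__ for f in funcs if inspect.iscoroutinefunction(f)]

print(find_coroutine_tests([test_async_thing, test_sync_thing]))
# → ['test_async_thing']
```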
The key point here is: the (potential) bug went unnoticed for a while now, because of the premature, then obsolete pin in IPython.
#Python