
We have finally documented Ruff – the tool greatly simplifies static code analysis for Python projects: #Python
#Python Friday #289: Record Audio With Sounddevice
https://pythonfriday.dev/2025/07/289-record-audio-with-sounddevice/
Got the Roll Dice dialog box themed up about as nice as it's going to get. Still might make it a touch smaller though.
#pathfinder2e #python #programming
SO many #Python frens that I see wayyy too rarely in less than 2mins 💔😭
https://www.youtube.com/watch?v=k6G_QIu7Im4
#Python world be like:
"Oh, hi, we wrote a new library implementing this spec."
"Hey, it looks like it doesn't conform to the spec, it doesn't pass the examples from it."
"Oh, you're right, we'll fix it ASAP."
…and that was over 3 years ago.
And yet projects keep adding a dependency on this library, which has a single "pre-alpha" release from 3.5 years ago and whose very first bug report points out it's incorrect.
Many of you already heard about this new algo that's faster than Dijkstra ( #python
Nice, #Python 3.9 is very soon EOL [1], so it is finally fine to use Structural Pattern Matching [2] (i.e. the `match` statement, a `switch` on steroids) everywhere! 🥳
[1] https://devguide.python.org/versions/
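In case you haven't played with it yet, a minimal sketch of what `match` buys you over a plain `switch` (requires Python >= 3.10):
```python
def describe(point):
    match point:
        case (0, 0):
            return "origin"
        case (x, 0):
            return f"on the x-axis at {x}"
        case (x, y) if x == y:           # guards refine a pattern
            return f"on the diagonal at {x}"
        case (x, y):
            return f"at ({x}, {y})"
        case _:                          # wildcard, matches anything
            return "not a point"

print(describe((3, 0)))  # on the x-axis at 3
print(describe("nope"))  # not a point
```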
#Steady-CommunityContent
With a simple #Python script on the #RaspberryPi, the performance data of the EZ1-SPE should
> Pure #Python.
Looks inside.
> Machine code written into the Python file, run via ctypes.
https://github.com/flababah/cpuid.py/blob/master/cpuid.p…
On the way to Berlin. I’m really looking forward to the BBQ with all the Berlin Python user groups: #Python
❓ Does #Python make you think of a nasty snakebite rather than of quantitative humanities?
🔜 Sign up for the "Python 101" workshop on 24.09.2025 at the NFDI4Culture Community Plenary in Mainz and get familiar with the basics of programming!
⚠️ Registration:
Modern programmers: "oh, let's hijack all #Python package managers in your bashrc without asking for consent, what could possibly go wrong."
And the best joke is, I didn't even really install the package — I was just making a random bugfix and running its test suite in a virtual environment.
#Gentoo #security
So while the Z80 project in Rust is going very, very slowly, the Python side is going pretty well. Got all of the register and flag implementations mostly done, including 16-bit register swaps (EX) and full prime swapping (EXX). Using `enum.IntFlag` to set/clear flags in the F register as well, which helps with legibility.
#python
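A minimal sketch of the `enum.IntFlag` approach, assuming the standard Z80 F-register bit layout:
```python
from enum import IntFlag

class F(IntFlag):
    C = 0x01   # carry
    N = 0x02   # add/subtract
    PV = 0x04  # parity/overflow
    H = 0x10   # half carry
    Z = 0x40   # zero
    S = 0x80   # sign

flags = F(0)
flags |= F.Z | F.C    # set zero and carry
flags &= ~F.C         # clear carry
print(F.Z in flags)   # True
```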
Spent 30 minutes discovering that #Python's csv module, which I've used often for years, defaults to CRLF on output regardless of the source file's line endings and your local environment.
I was converting a TSV to CSV and puzzling over why the resulting file was larger.
I will have to remember to set lineterminator='\n' going forward. Lossless compres…
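A minimal sketch of the fix (the `data.tsv`/`data.csv` names are placeholders):
```python
import csv

# Convert TSV to CSV while keeping plain LF line endings;
# without lineterminator='\n', csv.writer emits CRLF by default.
with open("data.tsv", newline="") as src, \
     open("data.csv", "w", newline="") as dst:
    writer = csv.writer(dst, lineterminator="\n")
    writer.writerows(csv.reader(src, delimiter="\t"))
```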
TIL that #Python's strptime can't handle unix timestamps, wtf... 😑
time.strptime("1750420325","%s")
ValueError: 's' is a bad directive
(Yes, I know there's datetime.fromtimestamp, but here I'd much prefer to keep everything with %Y %m %d et al. due to reasons)
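A sketch of the usual workaround — go through `fromtimestamp()` once and keep using the familiar directives afterwards:
```python
from datetime import datetime

ts = 1750420325
dt = datetime.fromtimestamp(ts)          # local time
print(dt.strftime("%Y-%m-%d %H:%M:%S"))
```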
On the way to Ludwigshafen for a one-week workshop on Python programming with LLMs and avoiding prompt injections.
#Python #LLM #PromptInjection
How to stupidly waste two hours with #Requests #python and its multipart Content-Type handling:
https://notes.sklein.xyz/2025-07-17_15
Okay, spent a little time with Python's Matplotlib to get this graph of miles biked so far this year...
At some point I should be able to pull the data from Run Gap's SQLite database to automate this more, and get more granular.
I still need to figure out how to space the bars and some other formatting stuff but it's a good start!
#python
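A rough sketch of such a chart — the month labels and mileage numbers here are made up:
```python
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
miles = [42, 55, 61, 70, 88, 93]  # placeholder data

fig, ax = plt.subplots()
ax.bar(months, miles, width=0.6)  # width controls bar spacing
ax.set_ylabel("miles biked")
ax.set_title("Miles biked per month")
fig.savefig("miles.png")
```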
#Python Friday #283: Play Audio Files in Python
https://pythonfriday.dev/2025/06/283-play-audio-files-in-python/
Agentic Document Extraction – #Python Library for Complex Document Processing 📄
powered by https://landing.ai
🔍 Extracts structured data from complex documents with tables, pictures & charts - retu…
#Steady-CommunityContent
In this article series, I'm covering my experiences with my #Steckersolargerät (plug-in solar device).
With a second #Python script, the
Hi Pythonistas! Previously, I was able to use the Trove classifier
Private :: Do Not Upload
to prevent packages from being accidentally uploaded to PyPI. Is there a similar option with the PEP 639 license expression?
@… #Python
Since my earlier question seems to be too specific: can anyone recommend a video course, online course, e-learning, etc. for #Python?
Preferably in German.
Thank you!
:boostRequest:
https://ruhr.social/@guerda/114…
Video tutorials for modern ideas and open source tools. #python
So #Gentoo #Python eclasses are pretty modern, in the sense that they tend to follow the best practices and standards, and eventually deal with deprecations. Nevertheless, they have a long history and carry quite some historical burden, particularly regarding naming.
The key point is that the eclasses were conceived as a replacement for the old eclasses: "distutils" and "python". Hence, much like we revision ebuilds, I've named the matching eclasses "distutils-r1" and "python-r1". For consistency, I've also used the "-r1" suffix for the remaining eclasses introduced at the time: "python-any-r1", "python-single-r1" and "python-utils-r1" — even though there were never "r0"s.
It didn't take long to realize my first mistake. I made the multi-impl eclass effectively the "main" eclass, probably largely inspired by the previous Gentoo recommendations. However, in the end I found out that for most use cases (i.e. where "distutils-r1" is not involved), there is no real need for multi-impl, and it makes things much harder. So if I were naming them today, I would have named it "python-multi", to indicate the specific use case — and either avoided designating a default at all, or made "python-single" the default.
What aged even worse is the "distutils-r1" eclass. Admittedly, back when it was conceived, distutils was still largely a thing — and there were people (like me) who avoided an unnecessary dependency on setuptools. Of course, nowadays it has been entirely devoured by setuptools, and with #PEP517 even "setuptools" wouldn't be a good name anymore. Nowadays, people get confused about why they are supposed to use "distutils-r1" for, say, Hatchling.
Admittedly, this is something I could have done differently — PEP517 support was a major migration and involved an explicit switch. Instead of adding the DISTUTILS_USE_PEP517 variable (what a self-contradictory name), I could have forked the eclass. Why didn't I do that? Because there used to be a lot of code shared between the two paths. Of course, over time they diverged more, and eventually I dropped the legacy support — but the opportunity to rename was lost.
On a semi-related note, I recognized another design problem with the eclass earlier — I should have gone for two eclasses rather than one: a "python-phase" eclass with generic sub-phase support, and a "distutils" (or later "python-pep517") eclass implementing default sub-phases for the common backends. And again, this is precisely how I could have solved the code reuse problem when I introduced PEP517 support.
But then, I didn't anticipate how the eclasses would end up looking in the end — and I can't really predict what new challenges the Python ecosystem is going to bring us. And I think it's too late to rename or split stuff — too much busywork for everyone.
no shade, but if your import path starts with `src.`, you haven’t understood the purpose of an src directory.
The fact that VS Code gets this wrong when fixing imports in tests is concerning.
#Python
After some refactoring, learning about `hatch`, moving more files around, and generally abusing `test.pypi.org`: I've uploaded `diceparse` to PyPI. Still need to update the web documentation, but it now feels like a proper project at this point.
I still need to add a CLI part so you can just roll dice after installing the package, but I'll handle that later. Also need to tweak the README.md a bit as well...
I've just discovered the `inspect.cleandoc` function from #Python's standard library.
https://notes.sklein.xyz/2025-06-13_1437/
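A quick illustration of what `inspect.cleandoc` does:
```python
import inspect

doc = """
    First line indented like source code.
        This nested line keeps its relative indent.
"""
# cleandoc() strips the leading blank line and the common indentation:
print(inspect.cleandoc(doc))
```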
#Python Friday #288: 5 Helpful Tricks for #UV
https://pythonfriday.dev/2025/07/288-5-hel…
The latest uv release (0.8.13) introduced the experimental uv format. It calls Ruff’s formatter to automatically style your code: https://github.com/astral-sh/uv/pull/15017. And you can use [tool.ruff] for both entry points.
Long shot in case anyone seeing this knows this stuff better than I...
Playing around with setting socket.IPV6_RECVPKTINFO in #Python, which is useful for UDP-based listeners to get the local address a datagram is sent to.
From what I can tell in my initial testing, the PKTINFO structure contains the address, but not the interface index like I thought it should. I don't care abo…
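For reference, a minimal sketch of the setup being described — the port is arbitrary, `socket.IPV6_RECVPKTINFO` is only exposed on platforms that define it, and the 16-byte address + 4-byte index layout is the nominal `in6_pktinfo` struct:
```python
import socket
import struct

sock = socket.socket(socket.AF_INET6, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_RECVPKTINFO, 1)
sock.bind(("::", 9999))

data, ancdata, flags, src = sock.recvmsg(4096, socket.CMSG_SPACE(20))
for level, ctype, cdata in ancdata:
    if level == socket.IPPROTO_IPV6 and ctype == socket.IPV6_PKTINFO:
        # struct in6_pktinfo: 16-byte destination address, 4-byte ifindex
        dst = socket.inet_ntop(socket.AF_INET6, cdata[:16])
        ifindex = struct.unpack("@I", cdata[16:20])[0]
        print(f"datagram from {src[0]} to {dst}, ifindex {ifindex}")
```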
I still need to get around to automating this stuff a bit more, but for now I just type a few numbers each month for my biking stats...
#bikeTooter #python
> Pure #Python.
Looks inside.
> Inline machine code ran via ctypes.
https://github.com/flababah/cpuid.py/blob/master/cpuid.py
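For the curious, a minimal sketch of the trick (Unix, x86-64 only; the bytes encode `mov eax, 42; ret`, and W^X hardening may block the writable+executable mapping):
```python
import ctypes
import mmap

CODE = b"\xb8\x2a\x00\x00\x00\xc3"  # mov eax, 42; ret

# Allocate a writable+executable page and copy the code in.
buf = mmap.mmap(-1, mmap.PAGESIZE,
                prot=mmap.PROT_READ | mmap.PROT_WRITE | mmap.PROT_EXEC)
buf.write(CODE)

# Wrap the buffer's address in a C function pointer and call it.
func_type = ctypes.CFUNCTYPE(ctypes.c_uint32)
addr = ctypes.addressof(ctypes.c_char.from_buffer(buf))
print(func_type(addr)())  # 42
```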
New in the #Python world: #setuptools now vendors LGPL-licensed deep dependencies. Not that I mind (but some people and companies do!) — but these dependencies aren't even used! I mean, `autocommand` is just a dependency of some scripts in `jaraco.text` that aren't used by setuptools.
Oh, wait, I actually do care, because I need to fix LICENSE in dev-python/ensurepip-setuptools.
https://github.com/pypa/setuptools/issues/5045
https://github.com/pypa/setuptools/issues/5049
Got the documentation for the dice parser software online and now I can close the laptop and relax a bit. I think it still needs some cleaning up, but it's good enough for now.
#python
I've drafted support for verification of #PyPI provenance for #Gentoo.
You know, the new fancy thing that protects against supply chain attacks on PyPI, and verifies that you're using genuine #GitHub artifacts. Because, you know, GitHub repositories and deployment pipelines are an unlikely attack vector. And you definitely don't need to worry about #Microsoft owning the keys, the repositories and the pipelines at all.
#security #Python #SigStore
Yes, yes, please fork #PythonRequests and a bunch of other high-profile #Python libraries as its dependencies, and add some more #NIH dependencies to that. Oh, yes, and definitely overwrite the original packages in the process! What could possibly go wrong?
#packaging
#Python Friday #282: Working With Temporary Files
https://pythonfriday.dev/2025/06/282-working-with-temporary-files/
One of the goals I've set for further development of #Python eclasses in #Gentoo was to avoid needless complexity. Unfortunately, the subject matter sometimes requires it. However, many of the functions added lately had already been done manually in ebuilds for years.
We started disabling plugin autoloading years ago. First we did that only for individual packages that caused issues, then for those where tests ended up being really slow, and finally pretty much anywhere `python_test()` was declared. Doing it all manually was particularly cumbersome — all I needed for `EPYTEST_PLUGINS` was a good idea of how to generalize it.
Similarly, `EPYTEST_XDIST` was added after we had been manually adding `epytest -p xdist -n "$(makeopts_jobs)" --dist=worksteal` — and while at it, I've added `EPYTEST_JOBS` to override the job count.
Perhaps `EPYTEST_TIMEOUT` wasn't that common. However, it was meant to help CI systems that could otherwise get stuck on a hanging test.
Similarly, matching "standard library" versions (like `3.9`) in `python_gen_cond_dep` was added after a long period of explicitly stating `python3_9 pypy3`. As an extra benefit, this also resolved the problem that at the time `pypy3` could mean different Python versions.
#Python Friday #290: Record Audio With #PyAudio
https://pythonfriday.dev/2025/08/290-r
New on #Quansight PBC blog: Python Wheels: from Tags to Variants
"""
Many #Python distributions are uniform across different Python versions and platforms. For these distributions, it is sufficient to publish a single wheel that can be installed everywhere. However, some packages are more complex than that; they include compiled Python extensions or binaries. In order to robustly deploy such software on different platforms, you need to publish multiple binary packages, and the installers need to select the one that best fits the platform used.
For a long time, Python wheels made do with a relatively simple mechanism to describe the needed variance: Platform compatibility tags. These tags identified different Python implementations and versions, operating systems, and CPU architectures. Over time, they were extended to facilitate new use cases. To list a couple: PEP 513 added manylinux tags to standardize the core library dependencies on GNU/Linux systems, and PEP 656 added musllinux tags to facilitate Linux systems with musl libc.
However, not all new use cases can be handled effectively within the framework of tags. To list a few:
• The advent of GPU-backed computing made distinguishing different acceleration frameworks such as NVIDIA CUDA or AMD ROCm important.
• As compatibility with older CPUs became less desirable, many distributions have set baselines for their binary packages to the x86-64-v2 microarchitecture level, and Python packages need to be able to express the same requirement.
• Numerical libraries support different BLAS/LAPACK, MPI, OpenMP providers, and wish to enable the users to choose the build matching their desired provider.
While tags could technically be bent to facilitate all these use cases, they would grow quite baroque, and, critically, every change to tags needs to be implemented in all installers and package-related tooling separately, making adoption difficult.
Facing these limitations, software vendors have employed different solutions to work around the lack of an appropriate mechanism. Eventually, the #WheelNext initiative took up the challenge to design a more robust solution.
"""
#packaging
Yet another "serious" security vulnerability in #Python.
And once again, the information about it landed in the "Library" section of the NEWS file, not in "Security".
https://www.cve.org/CVERecord?id=…
In other news, I've sent a few fun patches to improve epytest in #Gentoo.
This includes forcing short summaries, creating junit .xml for machine processing, and most importantly, EPYTEST_PLUGINS to handle specifying the plugins to load. The goal is to eventually move away from plugin autoloading by default.
#PyTest #Python
Works with #Python #JavaScript #TypeScript #Go supporting frameworks like
A Python package that can't be installed on Python 3.14, because the author absolutely had to implement over 200 lines of AST processing in `setup.py`? Why not.
#Gentoo #Python #setuptools
Nowadays in quality #Python: #Gentoo is running the #ProtoBuf-related test suite via #PyTest-forked to work around protobuf segfaulting during GC.
Of course, it implies random programs can segfault on exit too.
https://github.com/protocolbuffers/protobuf/issues/22067
https://gitweb.gentoo.org/repo/gentoo.git/tree/dev-python/protobuf/protobuf-6.31.1.ebuild?id=54e20d4bb0ec99ab868695a2980c4307d179cb10#n150
New on blog: "EPYTEST_PLUGINS and other goodies now in #Gentoo"
"""
If you are following the gentoo-dev mailing list, you may have noticed that there's been a fair number of patches sent for the #Python eclasses recently. Most of them have been centered on #pytest support. Long story short, I've come up with what I believed to be a reasonably good design, and decided it's time to stop manually repeating all the good practices in every ebuild separately.
In this post, I am going to shortly summarize all the recently added options. As always, they are all also documented in the Gentoo Python Guide.
"""
https://blogs.gentoo.org/mgorny/2025/07/26/epytest_plugins-and-other-goodies-now-in-gentoo/
#Matplotlib has a lot of "image comparison tests" that are horribly fragile. Technically, most of them permit some deviation from the reference images, but quite often I've been getting a higher RMS than that. So for a long time, we've been maintaining patches that increased the tolerance in tests, and that regularly had to be rebased and updated for new tests.
At some point upstream started adding conditions permitting higher tolerance on non-x86_64 platforms. Of course, these changes forced me to rebase our patches. Curiously enough, my previous overrides often happened to be close to the tolerance given for non-x86_64 platforms.
Today, it finally occurred to me that instead of updating the patch once again, I could try dropping it entirely and just sed-ing all occurrences of `platform.machine() == 'x86_64'` into `False`. And guess what — down to 3 failures (related to TeΧ). And I don't have to spend 15 minutes manually doing what effectively amounted to the same thing.
#Gentoo #Python
AHAHAHA COME AT ME AI
#programming #python #mos6502 #assembly
Some fun facts about #Python limited API / stable ABI.
1. #CPython supports "limited API". When you use it, you get extensions that are compatible with the specified CPython version and versions newer than that. To indicate this compatibility, such extensions use `.abi3.so` suffix (or equivalent) rather than the usual `.cpython-313-x86_64-linux-gnu.so` or alike.
2. The actual support is split between CPython itself and #PEP517 build systems. For example, if you use #setuptools and specify the `py_limited_api=` argument to the extension, setuptools will pass appropriate C compiler flags and swap the extension suffix (see the sketch after this list). There's similar support in #meson, and probably other build systems.
3. Except that CPython freethreading builds don't support the stable ABI right now, so building with "limited API" triggers an explicit error from the headers. Setuptools has opted to be explicit about this: it emits an error if you try to use `py_limited_api` on a freethreading interpreter. Meson currently just gives the compile error. This implies that package authors need to actively special-case freethreading builds and enable "limited API" conditionally.
4. Some future version of CPython will support "limited API" in freethreading builds. I haven't been following the discussions closely, but I suspect that it will only be possible when you target that version or newer. So I guess people will need to build two stable ABI wheels for a time — one targeting older Python versions, and one targeting newer versions plus freethreading. On top of that, all these projects will need to update their "no 'limited API' on freethreading" conditions.
5. And then there's #PyPy. PyPy does not feature a stable ABI, but it allows you to build extensions using "limited API". So setuptools and meson just detect that there is no `.abi3.so` on PyPy, and use regular suffix for the extensions built with "limited API".
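To make point 2 above concrete, here's a minimal sketch of a `setup.py` using setuptools' limited-API support — the module name and the 3.9 target are placeholders:
```python
from setuptools import Extension, setup

setup(
    ext_modules=[
        Extension(
            "mymod",              # placeholder module name
            sources=["mymod.c"],
            py_limited_api=True,  # build an .abi3.so extension
            # Target the limited API of CPython 3.9 and newer:
            define_macros=[("Py_LIMITED_API", "0x03090000")],
        )
    ],
)
```
On a freethreading interpreter, this is exactly the point where setuptools currently errors out instead of building.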
Someone asked about fixing #setuptools deprecation warnings today (#Gentoo collects and reprints them, so people actually notice). I think my best hint is: use another build system.
#Python #PEP517
Yet another "HIGH severity" vulnerability in #Python.
Once again found in "Library" section of the NEWS, not in "#Security".
https://www.cve.org/CVERecord?id=CVE-2025-8194
https://github.com/python/cpython/pull/137027/files#diff-27f72e5ff09b9527a57610751506f7e37d371a2d55b1305b96dcefb9f2d6cf1e
When you spend an hour backporting #CPython #security fixes to all versions of #Python in #Gentoo, because there was no planned security release, and a few hours later you spend time again bumping to the unexpected security releases.
And then you are surprised why you didn't mask Python 3.8 yet, and repeat the same mistake.
Oh, and ofc update your CPython and PyPy (fixed PyPy only in Gentoo).
A #Python package that can't be installed on Python 3.14, because the author had to implement a 200 line custom AST parser in `setup.py`? Yeah, why not.
#Gentoo #packaging #setuptools
A while ago, I've followed the example given by #Fedora and unbundled ensurepip wheels from #Python in #Gentoo (just checked — "a while ago" was 3 years ago). This had the important advantage that it enabled us to update these wheels along with the actual pip and setuptools packages, meaning new virtual environments would get fresh versions rather than whatever CPython happened to bundle at the time of release.
I had considered using our system packages to prepare these wheels, but since we were already unbundling dependencies back then, that couldn't work. So I just went with fetching upstream wheels from PyPI. Why not build them from source instead? Well, besides feeling unnecessary (it's not like the PyPI wheels are actually binary packages), we probably didn't have the right kind of eclass support for that at the time.
Inspired by @…, today I've tried preparing new revisions of ensurepip packages that actually do build everything from source. So what changed, and why should building from source matter now? Firstly, as part of the wheel reuse patches, we do have a reasonably clean architecture to grab the wheels created as part of the PEP517 build. Secondly, since we're unbundling dependencies from pip and setuptools, we're effectively testing different packages than these installed as ensurepip wheels — and so it would be meaningful to test both variants. Thirdly, building from source is going to make patching easier, and at the very least enable user patching.
While at it, I've refreshed the test suite runs in all three regular packages (pip, setuptools and wheel — we need an "ensurepip" wheel for the last of these because of the test suites). And of course, I hit some test failures in testing the versions with bundled dependencies, and I've discovered a random bug in #PyPy.
https://github.com/gentoo/gentoo/pull/42882 (yes, we haven't moved yet)
https://github.com/pypy/pypy/issues/5306
New reason not to use #PythonPoetry just dropped: they reinvented "reproducible builds", poorly. The problem is, they missed the purpose of reproducible builds entirely: they apply it to source distributions too, and when you don't set SOURCE_DATE_EPOCH, they force all files to the epoch (as in timestamp 0) instead of leaving them alone.
Like, all source distributions created by Poetry and uploaded to #PyPI now have 1970 timestamps that, simply speaking, break stuff. The most absurd thing is that the ZIP format can't even represent that timestamp (it starts at 1980), so they override it and use another date for wheels 🤦.
#Gentoo #PEP517
When you use #RustLang to write safe code, but what you get is a data corruption #heisenbug instead.
#Gentoo #Python
A bad #Python bump morning in #Gentoo:
1. A project that couldn't be bothered to make a release with a security fix for 4 years finally made a release. Of course, if you make one release in 7 years, it is definitely a good idea to replace your build system with a broken #PythonPoetry #setuptools hybrid.
2. Another project made a release with a bunch of test failures — that were fixed in "master" branch already at the time, but I guess nobody bothered testing the release branch.
3. Just discovered that a bunch of projects are using pkg_resources namespaces again — and we were supposed to have gotten rid of them years ago! Of course it's #Google. And on top of that, since pkg_resources are now throwing deprecation warnings, they are indirectly breaking random other test suites.
On the positive side, test_lolwut is failing for me in redis-py.
#Python #packaging be like:
"Remember the totally random #PyTest plugin that died in 2018, that we forced you to add to #Gentoo, because we decided to start using it for no good reason? Well, we just stopped. Also, we just found a #NIH plugin that reinvents flaky test handling for the third time, enjoy!"
(Fortunately, it's compatible enough with pytest-rerunfailures, so we can ignore it.)
#PyYAML rejected #freethreading support. As a result, a new fork has been created with freethreading support. Given the fork's focus on freethreading, it supports only Python 3.13 and newer. Given the lack of environment markers for freethreading (yet), packages end up depending on PyYAML-ft for >=3.13 (including non-freethreading builds), and on PyYAML for <3.13.
Isn't #Python #packaging great?
#Gentoo