
One other key part of this is freezing a timestamp with your dependency list, because Python packages are absolutely terrible at maintaining compatibility a year or three or five later as PyPI populates with newer and newer versions. The special toml incantation is [tool.uv] exclude-newer:

  # /// script
  # dependencies = [
  #   "requests",
  # ]
  # [tool.uv]
  # exclude-newer = "2023-10-16T00:00:00Z"
  # ///
https://docs.astral.sh/uv/guides/scripts/#improving-reproduc...

This has also let me easily reconstruct some older environments in less than a minute, when I've been version hunting for 30-60 minutes in the past. The speed of uv environment building helps a ton too.



Maybe I'm missing something, but why wouldn't you just pin to an exact version of `requests` (or whatever) instead? I think that would be equivalent in practice to limiting resolutions by release date, except that it would express your intent directly ("resolve these known working things") rather than indirectly ("resolve things from when I know they worked").


Pinning deps is a good thing, but it won't necessarily solve the issue of transitive dependencies (i.e. the dependencies of requests itself), which are not pinned themselves, given you don't have a lock file.

To be clear, a lock file is strictly the better option—but for single file scripts it's a bit overkill.


1 file, 2 files, N files, why does it matter how many files?

Use a lock file if you want transitive dependencies pinned.

I can't think of any other language where "I want my script to use dependencies from the Internet, pinned to precise versions" is a thing.


If there's a language that does this right, I'm all ears, but I haven't seen it.

The use case described is for a small one off script for use in CI, or a single file script you send off to a colleague over Slack. Very, very common scenario for many of us. If your script depends on

    a => c
    b => c
You can pin versions of those direct dependencies like "a" and "b" easy enough, but 2 years later you may not get the same version of "c", unless the authors of "a" and "b" handle their dependency constraints perfectly. In practice that's really hard and never happens.

The timestamp approach described above isn't perfect, but it will produce the same dependency graph, and the same results, 99% of the time.
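To make that argument concrete, here's a toy sketch of why a shared date cutoff pins the transitive dependency "c" deterministically. The version history is made up for illustration (not real PyPI data), and the function only mimics the idea behind uv's exclude-newer filter:

```python
from datetime import datetime, timezone

# Hypothetical release history for transitive dependency "c":
# version -> upload time (illustrative data, not real PyPI history).
RELEASES_OF_C = {
    "1.0.0": datetime(2022, 5, 1, tzinfo=timezone.utc),
    "1.1.0": datetime(2023, 3, 12, tzinfo=timezone.utc),
    "2.0.0": datetime(2024, 8, 2, tzinfo=timezone.utc),  # breaking release
}

def resolve(releases, cutoff):
    """Pick the newest version uploaded before the cutoff, mimicking
    a date-based filter like uv's exclude-newer."""
    candidates = {v: t for v, t in releases.items() if t < cutoff}
    if not candidates:
        raise LookupError("no version available before cutoff")
    return max(candidates, key=candidates.get)

cutoff = datetime(2023, 10, 16, tzinfo=timezone.utc)
print(resolve(RELEASES_OF_C, cutoff))  # 1.1.0, no matter when you run this
```

Run today or in five years, the cutoff hides 2.0.0, so "c" resolves identically even though "a" and "b" never pinned it.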


Try Scala with an Ammonite script like https://ammonite.io/#ScalaScripts . The JVM ecosystem does dependencies right, there's no need to "pin" in the first place because dependency resolution is deterministic to start with. (Upgrading to e.g. all newer patch versions of your current dependencies is easy, but you have to make an explicit action to do so, it will never happen "magically")


Rust tends to handle this well. It'll share c if possible, or split dependencies. Cargo.lock preserves exact resolution


> 1 file, 2 files, N files, why does it matter how many files?

One file is better for sharing than N: you can post it in a messenger program like Slack and easily copy-and-paste it (this becomes annoying with more than one file), or upload it somewhere without needing to compress, etc.

> I can't think of any other language where "I want my script to use dependencies from the Internet, pinned to precise versions" is a thing.

This is the same issue you would have in any other programming language. If it is fine for possibly having breakage in the future you don't need to do it, but I can understand the use case for it.


I think it's a general principle across all of software engineering that, given the choice, you want fewer disparate locations in the codebase that need correlated changes.

Documentation is hard enough, and that's often right there at exactly the same location.


> why does it matter how many files?

Because this is for scripts in ~/bin, not projects.

They need to be self-contained.


For N scripts, you will need N lock files littering your directories and then need venvs for all of them.

Sometimes, the lock files can be larger than the scripts themselves...


One could indicate implicit time-based pinning of transitive dependencies, using the time at which the depended-on versions were released. Not a perfect solution, but it's a possible approach.


isn't that quite exactly what the above does?


I think OP was saying to look at when the package was built instead of explicitly adding a timestamp. Of course, this would only work if you specified `requests@1.2.3` instead of just `requests`.

This looks like a good strategy, but I wouldn't want it by default, since it would be very weird to suddenly have a script pull dependencies from 1999 without explanation why.
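That derived-cutoff idea can be sketched in a few lines. The upload times below are made up for illustration; in reality you'd fetch `https://pypi.org/pypi/<name>/<version>/json` and read the `upload_time_iso_8601` field for each pin:

```python
from datetime import datetime, timezone

# Made-up upload times for pinned direct dependencies (illustrative only;
# real values would come from PyPI's JSON API per name/version).
PINNED_UPLOADS = {
    ("requests", "1.2.3"): datetime(2013, 5, 23, tzinfo=timezone.utc),
    ("six", "1.3.0"): datetime(2013, 3, 16, tzinfo=timezone.utc),
}

def implied_cutoff(upload_times):
    """Latest upload time among the pins: nothing the resolver picks,
    including transitive deps, may be newer than your newest pin."""
    return max(upload_times.values())

print(implied_cutoff(PINNED_UPLOADS).isoformat())
# a timestamp of the kind you could paste into [tool.uv] exclude-newer
```

Using the newest pin's date as the cutoff is exactly the "dependencies from 1999" behavior described: implicit, so it probably wants to be opt-in.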


I'm not a Python packaging expert or anything, but an issue I run into with lock files is that they can become machine-dependent (for example, different flavors of torch on some machines vs. others).


Oh yeah, I completely forgot about transitive dependencies. That makes perfect sense, then! Very thoughtful design/inclusion from `uv`.


Except at least for the initial run, the date-based approach is the one closer to my intent, as I don't know what specific versions I need, just that this script used to work around a specific date.


Oh that's neat!

I've just gotten into the habit of using only the dependencies I really must, because Python culture around compatibility is so awful.


This is the feature I would most like added to rust, if you don’t save a lock file it is horrible trying to get back to the same versions of packages.


Why wouldn't you save the lock file?


Well, of course you should, but it’s easy to forget as it’s not required. It also used to be recommended to not save it, so some people put it in their gitignore.

For example, here is a post saying it was previously recommended to not save it for libraries: https://blog.rust-lang.org/2023/08/29/committing-lockfiles.h...


Gosh, thanks for sharing! This is the remaining piece I felt I was missing.


For completeness, there's also a script.py.lock file that can be checked into version control, but then you have twice as many files to maintain, which can potentially lose sync as people forget about it or don't know what to do with it.


Wow, this is such an insanely useful tip. Thanks!


Why didn't you create a lock file with the versions and of course hashsums in it? No version hunting needed.


Because the aim is to have a single file, fairly short, script. Even if we glued the lock file in somehow, it would be huge!

I prefer this myself, as almost all lock files are in practice “the version of packages at this time and date”, so why not be explicit about that?


A major part of the point of PEP 723 (and the original competing design in PEP 722) is that the information a) is contained in the same physical file and b) can be produced by less sophisticated users.


That's fantastic, that's exactly what I need to revive a bit-rotten python project I am working with.


Oooh! Do you end up doing a binary search by hand and/or does uv provide tools for that?


Where would binary search come into it? In the example, the version solver just sees the world as though no versions released after `2023-10-16T00:00:00Z` existed.


I mean a binary search or a bisect over dates.



