maegul (he/they)

A little bit of neuroscience and a little bit of computing

  • 72 Posts
  • 881 Comments
Joined 2 years ago
Cake day: January 19th, 2023




  • Absolutely. It’s a shit show.

    And interestingly, making the general public more aware of this is likely quite important. Because 1, they have very idealistic views of what research is like, and 2, just about everyone is entering research blind to the realities. It’s a situation that needs some sunlight and rethinking.

    IMO, a root cause is that the heroic genius researcher ideal at the base of the system’s design basically no longer exists. Things are just too big and complex now for a single person to be that important. Dismantle that ideal and redesign from scratch.



  • It’s definitely an interesting and relevant idea I think! A major flaw here is the lack of ability for communities to establish themselves as discrete spaces separate from the doomscrolling crowd.

    A problem with the fediverse as a whole, IMO, since community building is what it should be focusing on.

    Generally, decentralisation makes things like this difficult, AFAIU. Lemmy has things like private and local-only communities in the works that will get you there. But then discovery becomes a problem, which probably requires some additional features too.


  • The catch is that the whole system is effectively centralised on BlueSky backend services (basically the relay). So while the protocol may be standardised and open, and implemented with decentralised components, they’ll control the core service. Which means they can unilaterally decide to introduce profitable things like ads and charging for features.

    The promise of the system though is that it provides for various levels of independence that can all connect to each other, so people with different needs and capabilities can all find their spot in the ecosystem. Whether that happens is a big question. Generally I’d say I’m optimistic about the ideas and architecture, but unsure about whether the community around it will get it to what I think it should be.



  • I think for python tooling the choice is Python Vs Rust. C isn’t in the mix either.

    That seems fair. Though I recall Mamba making headway (at least in the Anaconda/conda space), and it is a C++ project. AFAIU, its underlying internals (libmamba) have now been folded into conda, which would mean a fairly popular, and arguably successful, portion of the tooling ecosystem (I tended to reach for conda and recommend the same to many) is reliant on a C++ foundation.

    On the whole, I imagine this is a good thing, as the biggest issue conda had was performance when resolving environments and package versions.

    So, including C++ as part of C (which is probably fair for the purposes of this discussion), I don’t think C is out of the mix either. Should there ever be a push to fold something into core python, using C would probably come back into the picture too.


    I think there’s a survivor bias going on here.

    Your survivorship bias point on rust makes a lot of sense … there’s certainly some push back against its evangelists, and that’s fair (as someone who’s learnt the language a bit). Though I think it’s also fair to point out that the success stories are “survivorship” stories worth noting.

    But it seems we probably come back to whether fundamental tooling should be done in python or a more performant stack. And I think we just disagree here. I want the tooling to “just work” and work well, and personally don’t hold nearly as much interest in being able to contribute to it as I do any other python project. If that can be done in python, all the better, but I’m personally not convinced (my experience with conda, while it was a pure python project, is informative for me here).

    Personally I think python should have paid more attention to both built-in tooling (again, I think it’s important to point out how much of this is simply Guido’s “I don’t want to do that” that probably wouldn’t be tolerated these days) and built-in options for more performance (by maybe taking pypy and JIT-ing more seriously).

    Maybe the GIL-less work and more performant python tricks coming down the line will make your argument more compelling to people like me.

    (Thanks very much for the chat BTW, I personally appreciate your perspective as much as I’m arguing with you)


  • Yep! And likely the lesson to take from it for Python in general: the general utility of a singular foundation that the rest of the ecosystem can be built out from.

    Even that it’s compiled is kinda beside the point. There could have been a single Python tool written in Python and bundled with its own Python runtime. But Guido never wanted to take on projects and package management, and so it’s been left as the one battery definitely not included.


  • I feel like this is conflating two questions now.

    1. Whether to use a non-Python language where appropriate
    2. Whether to use rust over C, which is already heavily used and fundamental in the ecosystem (I think we can put Cython and Fortran to the side)

    I think these questions are mostly independent.

    If the chief criterion is accessibility to the Python user base, issue 2 isn’t a problem IMO. One could argue, as does @eraclito@feddit.it in this thread, that in fact rust provides benefits along these lines that C doesn’t. Rust being influenced by Python adds weight to that. Either way though, people like and want to program in rust and have provided marked success so far in the Python ecosystem (as eraclito cites). It’s still a new-ish language, but if the core issue is C v Rust, it’s probably best to address it on those terms.


  • Fair, but at some point the “dream” breaks down. Python itself is written in C, and plenty of packages, some vital, rely on C or Cython (or Fortran), and now increasingly on rust. So why not the tooling that’s used all the time, doing some hard work, often in build/testing cycles?
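    To give a (hypothetical) sketch of how deep the C interop already runs: the stdlib’s own `ctypes` module can call straight into libc with no build step at all. Nothing here is from the thread, just an illustration:

    ```python
    # Calling C's strlen directly from Python via ctypes (stdlib only).
    import ctypes
    import ctypes.util

    # Locate and load the C standard library.
    libc = ctypes.CDLL(ctypes.util.find_library("c"))

    # Declare strlen's signature: size_t strlen(const char *s)
    libc.strlen.argtypes = [ctypes.c_char_p]
    libc.strlen.restype = ctypes.c_size_t

    print(libc.strlen(b"batteries included"))  # 18
    ```

    Every compiled extension, whether written in C, C++ or rust, ultimately plugs into this same C ABI that CPython exposes.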

    If Guido had included packaging and project management in the standard library ages ago, with parts written in C, no one would bat an eyelid over whether users could contribute to that part of the system. Instead, they’d celebrate the “batteries included”, “ease of use” and “zen”-like achievements of the language.

    Somewhere in Simon’s blog post he links to a blog post by Armin on this point, which is that the aim is to “win”, to make a singular tool that is better than all the others and which becomes the standard that everyone uses so that the language can move on from this era of chaos. With that motive, the ability for everyday users to contribute is no longer a priority.


  • Cool to see so many peeps on the Fedi!

    While I haven’t used uv (been kinda out of Python for a while), and I understand the concerns some have, the Python community getting worried about good package/project management tooling says a lot about how much senior Python devs have gotten used to their ecosystem. Somewhat ditto for the concern over using a more performant language for fundamental tooling (rather than pursuing the dream of writing everything in Python, which is now surely dead).

    So Simon is probably right in saying (in agreement with others):

    while the risk of corporate capture for a crucial aspect of the Python packaging and onboarding ecosystem is a legitimate concern, the amount of progress that has been made here in a relatively short time combined with the open license and quality of the underlying code keeps me optimistic that uv will be a net positive for Python overall

    Concerns over maintainability should Astral go down may best be served by learning rust and establishing best practices around writing Python tooling in compiled languages to ensure future maintainability and composability.
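    For what it’s worth, part of why a tool like uv can slot in (and be swapped out) is that it builds on the standard pyproject.toml project metadata rather than a bespoke format. A minimal sketch, with a made-up project name and pins:

    ```toml
    [project]
    name = "demo-app"            # hypothetical
    version = "0.1.0"
    requires-python = ">=3.9"
    dependencies = [
        "requests>=2.31",        # illustrative pin
    ]
    ```

    So even if Astral disappeared tomorrow, the metadata itself stays portable to whatever tool comes next.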



  • Just recently read your 2017 article on the different parts of the “Free Network”, where it was new to me just how much the Star Trek federation was used and invoked. So definitely interesting to see that here too!

    Aesthetically, the fedigram is clearly the most appealing out of all of these. For me at least.

    It seems though that using the pentagram may have been a misstep given how controversial it seems to be (easy to forget if you’re not in those sorts of spaces). I liked the less pentagram-styled versions at the bottom. I wonder if a different geometry could be used?


  • Yea, absolutely. Just as the Omicron variant broke out, I came to the conclusion that we had basically added a new virus on top of Influenza. One that was generally worse and not at all seasonal like Influenza. And that the net effect was and is humanity’s global prosperity taking an easily measurable hit. Before Omicron, IMO, there was hope that we could beat it.

    The way Omicron happened and the way we responded (IE “didn’t you know the pandemic is over!”) made it pretty clear what people’s attitudes were. The only thing that really got us through the pandemic was bio-medical scientists nailing their job, and it will be the only thing that helps us going forward. As much cause for celebration as that is, the dark side is that hyper-individualism has prevented any other faculties from helping out in the meantime (or, as I said above, from steering us off a darker path).

    A formative experience for me: amongst my internet COVID bubble (mostly a particular sub-reddit), I was the first to hear about Omicron and share news about its many mutations (this was before it had really spread, just when the first analysis of its genome came out). The response from people, who were generally concerned enough about the pandemic to be on a sub-reddit about it, was entirely dismissive. “Fear mongering”, “viruses evolve toward being less severe” … etc … were the universal and popular responses. All when it was as black and white as anything … more mutations meant more immune escape, just like with influenza from year to year.

    But no one wanted to hear that. They were all done with the lockdowns and panic, and had subscribed, essentially religiously, to the “promised coming of the end of the pandemic”. And then the variant turned out to have plenty of immune escape, spread like wildfire, and in my area caused the greatest spate of deaths in the whole fucking pandemic. Now it’s probably fair to say that it was the second pandemic and we welcomed it with arms wide open (all variants since are direct descendants of Omicron, AFAIU, and Omicron itself was not descended from any of the other variants around at the time but branched off from or near the original strain).


  • The 5 year mark of the pandemic is just around the corner now. And it’s interesting to reflect on how well things are going compared to early forecasts.

    My memory is that 3–5 years was put out there as the likely longest horizon for the pandemic. Objectively, it seems pretty clear that it has not gone away at all, and that any progress on actually reducing its prevalence is either speculative (eg new nasal vaccines) or relies on “unacceptable” civil or infrastructural measures (masks, remote work, air filters etc).

    All of which is basically a failure.

    Another way of cutting it though might be to view the Omicron variant as a second pandemic that is proving generally worse than the first in part because it’s catching us at our most indifferent.

    I feel like there was a point there where a good vaccine roll out could have contained the delta or preceding variants. Which to me only highlights how all of the civil measures we were taking and could have taken were not just about maximising health at the time, but also about preventing us from going down a darker path of no return, which seems to be where we are now. If global measures had been taken to limit the spread of the virus and so prevent its evolution, I wonder how good a chance a vaccine would then have had of quashing it.




  • Cheers for the shout out! Yea the idea of that community is to be a kind of study group.

    Whenever I’ve posted a thought or idea, that’s part question part experiment part pondering, I’ve gotten great replies from others.

    Also, two people have been running Twitch streams of working through the book. One is nearly done I think (they’ve been going for half a year now, which is quite impressive).

    The community is at a point now, I suspect, where some of us have learnt rust well enough to spread out into projects etc, so it’d be nice to work out how we can do that together.

    Part of my initial idea with the community was to then have a study group for working through the lemmy codebase, treating it as a helpfully relevant learning opportunity … as we’re all using it, we all probably have features we’d like to add, and the devs and users of it are all right here for feedback.

    Additionally, an idea I’ve been mulling over, and one I’d be interested in feedback on … running further “learning rust” sessions where some of us, including those who’ve just “learned” it, actually try to help teach it to newcomers.

    Having a foundation of material such as “The Book” would make a lot of sense. Where “local teachers” could contribute I think is in posting their own thoughts and perspectives on what is important to take away, what additional ideas, structures or broader connections are worth remembering, and even coming up with little exercises that “learners” could go through and then get feedback on from the “teachers”.