• 0 Posts
  • 8 Comments
Joined 5 months ago
Cake day: June 10th, 2024

  • Typically this is true, but it’s certainly possible to get comparable performance with functional style

    It’s possible, but you have to specifically write code that’s fast rather than idiomatic or ergonomic, and you have to know what you’re doing. At that point, you may be better off writing it in something else. I feel like OCaml is good at this because it lets you write the abstractions and main control flow in a functional style and the hot paths in an imperative style without switching languages; Rust lets you do the same.
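
    Roughly what I mean by mixing the two styles, as a small sketch in Rust (rather than OCaml) purely for illustration:

    ```rust
    // Functional-style control flow: compose iterator adapters.
    fn sum_of_squares(xs: &[i64]) -> i64 {
        xs.iter().map(|x| x * x).sum()
    }

    // Imperative-style "hot path": an explicit loop with a mutable accumulator,
    // written without leaving the language.
    fn sum_of_squares_hot(xs: &[i64]) -> i64 {
        let mut acc = 0;
        for &x in xs {
            acc += x * x;
        }
        acc
    }

    fn main() {
        let data = [1, 2, 3, 4];
        assert_eq!(sum_of_squares(&data), sum_of_squares_hot(&data));
    }
    ```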

    Carp, which I linked above, basically uses the same approach to memory management as Rust. It doesn’t rely on GC.

    I’ll take a look, thanks!

    I also find that for most cases it really doesn’t matter all that much unless you’re in a specific domain like writing drivers, making a game engine, etc. Computers are plenty fast nowadays, and ergonomics tend to be more important than raw performance.

    I mostly agree with you, e.g. Haskell and Clojure, despite being “slow”, are plenty fast for what they’re used for. On the other hand, I’m very much annoyed when “user-facing” software takes way too long to load or to do simple tasks. Java in particular is pretty bad at this: JOSM (the Java OpenStreetMap editor) takes longer to load than my entire desktop startup, including a window manager and a browser. Unfortunately it’s also the best editor around, so I pretty much have to use it to edit OSM, but it still annoys me to no end. Unnecessary computation, I/O inefficiencies, and layers of wrapping also affect power consumption quite noticeably.


  • balsoft@lemmy.ml to Programmer Humor@lemmy.ml · OOP theory vs practice

    Modern C compilers are a fascinating blend of functional and imperative, that’s true; and I didn’t say that C is “close to how the modern architectures work”. However, mainstream modern architectures are almost always engineered primarily with C in mind, and this is also acknowledged in the article you linked. Rust, which shares a lot with C in terms of its underlying memory model, calling conventions, and control-flow primitives, can often benefit from those hardware patterns and optimizations in a way that’s more difficult to replicate in a functional language (especially since most of them are garbage-collected due to their memory model). The closest I’ve seen in terms of how easy it is to write fast code is OCaml, but even there the fast paths are often written in a very much imperative style. Idris2 also seems promising if they manage to get a GC-less mode working. Maybe also Roc, but I haven’t taken a look at it yet.

    Note that I write all of this as someone spending a lot of their work time programming in a functional language (Haskell), with Rust being mostly for hobby stuff. It just always surprises me how much easier it is to write fast code in Rust, and yet also how much of my Haskell intuition was applicable when I was learning it.


  • I agree that they fit different niches! My point was that with modern CPU architectures, imperative languages make it much easier to write fast and efficient code, just because the hardware was pretty much engineered with C in mind. IMHO Rust offers the best of both worlds when it comes to systems/low-level dev.


  • TBH Rust is pretty nice: it borrows (pun intended) a lot of ideas from the functional world (algebraic data types, traits, closures, affine types to an extent, composition over inheritance, and the general vibe of type-driven development), but it’s much easier to write fast, efficient code and to integrate with decades of libraries from imperative languages, and the ecosystem somehow feels mature already.
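
    For a taste of what that looks like (a small illustrative sketch, the names are made up): an enum is an algebraic data type, a trait attaches behaviour by composition rather than inheritance, and closures plus iterator combinators keep the functional vibe:

    ```rust
    // An algebraic data type (sum type), much like a variant type in OCaml
    // or a data declaration in Haskell.
    enum Shape {
        Circle { radius: f64 },
        Rect { w: f64, h: f64 },
    }

    // A trait plays a role similar to a type class: behaviour is attached to a
    // type by implementing the trait, not by inheriting from a base class.
    trait Area {
        fn area(&self) -> f64;
    }

    impl Area for Shape {
        fn area(&self) -> f64 {
            // Exhaustive pattern matching, checked by the compiler.
            match self {
                Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
                Shape::Rect { w, h } => w * h,
            }
        }
    }

    fn main() {
        let shapes = [
            Shape::Circle { radius: 1.0 },
            Shape::Rect { w: 2.0, h: 3.0 },
        ];
        // Closures and iterator combinators instead of explicit loops.
        let total: f64 = shapes.iter().map(|s| s.area()).sum();
        println!("total area = {total}");
    }
    ```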


  • So, here’s my attempt:

    The first portion (^.?$) matches all lines of 0 or 1 characters.

    The second portion (^(..+?)\1+$) is more complicated:

    1. (..+?) is a capture group that matches the first character of the line, followed by the smallest possible non-zero number of further characters such that (2) still matches (note that the minimum length of this match is 2 characters)
    2. \1+ matches one or more (as many as possible) repeats of whatever group (1) captured

    I think what this does is match any line consisting of a single repeated character whose length is

    • divisible by some number (due to the “more than 0 repeats” condition in (2), there have to be repeats in the string) that’s not
      • 1 (due to the note in (1), the repeating portion has to be at least 2 characters long), or
      • the length itself (due to the “more than 0 repeats” condition in (2), there is at least one repetition)

    Therefore, combined with the first portion, it matches all lines of the same repeated character whose lengths are composite (non-prime) numbers? (It will also match any line of length 0 or 1, and any line consisting of some string of length ≥ 2 repeated more than once, even if it isn’t a single character.)
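
    One way to sanity-check that conclusion (a quick sketch in Rust, since that’s what most of this thread is about; the standard regex crate doesn’t support backreferences, so this assumes the fancy-regex crate instead):

    ```rust
    // Assumes a dependency on the fancy-regex crate (the std `regex` crate
    // deliberately leaves out backreferences).
    use fancy_regex::Regex;

    fn main() {
        // Both portions combined into one pattern.
        let re = Regex::new(r"^.?$|^(..+?)\1+$").unwrap();

        for n in 0..=12 {
            let line = "x".repeat(n);
            // is_match returns a Result because backtracking can fail.
            let matched = re.is_match(&line).unwrap();
            println!(
                "length {:2}: {}",
                n,
                if matched { "match (non-prime)" } else { "no match (prime)" }
            );
        }
    }
    ```

    If the reading above is right, only the prime lengths 2, 3, 5, 7 and 11 should come out as “no match”.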