• 1 Post
  • 50 Comments
Joined 1 year ago
Cake day: June 15th, 2023

  • insomniac_lemon@kbin.social to Programming@programming.dev...
    5 months ago

    Yeah, the only language I’ve seen/tried that actually feels right*.

    But for me it falls down when it comes to needing other people and/or the specific engine-level stuff I want in order to get started. I was hoping to start out simple with the Raylib bindings, but even there I can't get vertex colors on imported models to work (the basic loop I'm attempting is sketched at the end of this comment), and I tinkered with my own 2D polygon format, but it was too much work for me to finalize.

    My part of the fediverse doesn't seem to work well for asking niche questions, at least. I don't see much talk about Nim, and it doesn't help that it's hard to search for when people don't say nim-lang. Also, there are 2 replies to you that aren't federated to where I can see them (and my art threads, low-poly with vertex colors, for instance plant, aren't federated to your instance).

    *= That may also be a mix of my own issues plus how some people style their code, though.
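
    For reference, the sort of minimal load-and-draw loop I mean is roughly the sketch below. It assumes naylib-style bindings (the exact names differ a bit between the various Nim Raylib bindings), and model.glb is just a placeholder for a glTF export with vertex colors and no textures, so treat it as a sketch rather than known-working code:

    ```nim
    # Rough sketch only: naylib-style names, which may differ in other bindings.
    import raylib

    initWindow(800, 600, "vertex color test")

    var camera = Camera3D(
      position: Vector3(x: 4, y: 4, z: 4),  # where the camera sits
      target: Vector3(x: 0, y: 0, z: 0),    # what it looks at
      up: Vector3(x: 0, y: 1, z: 0),
      fovy: 45,
      projection: Perspective
    )

    # Placeholder path: a glTF export with vertex colors and no textures.
    let model = loadModel("model.glb")

    setTargetFPS(60)
    while not windowShouldClose():
      beginDrawing()
      clearBackground(RayWhite)
      beginMode3D(camera)
      # White tint so the per-vertex colors aren't multiplied by anything else.
      drawModel(model, Vector3(x: 0, y: 0, z: 0), 1, White)
      endMode3D()
      endDrawing()

    closeWindow()
    ```

    If that still comes out flat/untinted, my guess is the color attribute is getting dropped somewhere between the export and the mesh upload, which is about where I got stuck.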



  • EDIT: Personal stuff aside, you might have better luck in a programming community, particularly in a programming-centric instance (though IME federation w/Kbin may be an issue), and probably even better with an example of an article.

    Then again, it might be an uphill battle when it comes to actually convincing people to use your place over others.


    I don't like most programming languages, and I probably want too much. Problem with me, or the timeline, or something.

    I do like one language and it's a bit niche.
    Even so, I don't really use it. The last code I wrote (months ago, in September): load format example.

    I also recently made a few 3D models with only vertex colors (no textures), such as plant, and that didn't work exactly as hoped with Raylib either (bindings-specific? maybe not?). Guess old tech isn't as well supported as I thought it would be.

    That’s all I have to post now.



  • I haven't seen a lot and don't really read (but it's empty here anyway), but:

    I'd like to see more related to cyborg stuff, beyond just prostheses and aesthetics. The type of thing where there isn't much organic left besides the brain and symbiotic systems. Living in different forms.

    @Gamers_Mate


    Meta notes:

    • GitS probably has some of this and I should probably watch it, but it's not really the feel I'm hoping for, especially with the religious/existential themes
    • Farscape has some of this with the leviathans, but also a string of related bad events (that ship needs therapy)
    • lovable robots often have the feel down, and while none of the explanation is there, it could often work in a headcanon sort of way, especially when the backstory isn't fully explained




  • EDIT: The fact that I don't dream in 1st-person is probably the most relevant bit here. For others, I guess I'd say try looking into a mirror to make sure you're you; at least if you can remember who you are and that other people are not you (therefore, if you're them, it cannot be reality).

    Though, lacking experience, I don't know if mirrors in dreams have common effects; mirrors just not working in dreams sounds like something I may have heard before.


    Probably unhelpful, but I do not dream with enough clarity for that to be an issue. The more vivid ones I've had seem to be shorter (I once had a dream that was basically just a still picture with moving colors); everything else is usually just weird and at best might be mistaken for a cheesy movie. I also cannot recall any from my own (or any) 1st-person perspective, even if the dreams might have details or themes from my own life.

    The lack of detail/vividness may be related to my aphantasia, but it might also be an issue with REM sleep due to health issues, particularly if I don't remember having a dream even long before I've woken up.


  • e.g. eog can do it, although it won’t preserve size of the image

    Yeah, that is the case. But it wouldn’t be an issue if it could properly zoom-to-fit. I also do not like the CSD (for my current setup with a very minimal custom XFWM window theme at least).

    Geeqie has poor zoom too, though not the zoom resetting, but it updates the image too slowly for my liking (and there doesn't appear to be a setting for it; maybe it's a performance thing) with watch set to 0.1.

    Feh has even worse scaling (the window size is set to the image size), but change a few (non-persistent) settings and it works and updates quickly. I don't want it as my main image viewer, though. I guess I could use it just for this.


    On a side note, a similar sort of idea I've had is using a game controller (I have a Steam Controller) for its analog controls, alongside a mouse (or a cheap drawing tablet/monitor), to get more drawing functionality*.

    It seems like something that should be possible with evdev, or maybe xinput, probably with a remapping tool, but I tried it with SC-controller (a while ago) and it didn't seem to work.

    *= For instance, pressure (or flow) on the trigger, rotation on the trackpad, angle on the gyro, and then the extra buttons on that side can be used for other shortcuts.



  • That’s really good, thank you! The only thing I’m not liking is that with the default terminal size I couldn’t see the error telling me that xbr=8 is not supported (so all I saw was the output just not updating).

    Got it working with my image viewer but I need to refresh it with a click and F5.

    Haven't tried it with a live image refresh, but tomorrow I'll update my system and try Geeqie, as that seems to have reloading too (my current image viewer seems to be dead, and they point to Geeqie as an alternative). Or something else.

    Too bad Krita doesn't have auto-refreshing references (though there may be a way to do it that I'm unaware of). Worst case, a throwaway viewer like the sketch at the end of this comment would do it.

    Image preview for non-Kbin

    EDIT: I tried writing the alt-text twice but I don’t see it
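
    The worst-case sketch mentioned above: an auto-refreshing reference viewer wouldn't be much code. Untested, again naylib-style names (which may differ per binding), and reference.png is a placeholder; it just polls the file's modification time and reloads the texture when it changes:

    ```nim
    # Untested sketch of a tiny auto-reloading reference viewer.
    # Assumes naylib-style Raylib bindings; names may differ.
    import std/[os, times]
    import raylib

    const refPath = "reference.png"  # placeholder path

    initWindow(800, 600, "reference")
    setTargetFPS(30)

    var tex = loadTexture(refPath)
    var lastMod = getLastModificationTime(refPath)

    while not windowShouldClose():
      # Reload whenever the file on disk changes (e.g. after re-exporting).
      let modTime = getLastModificationTime(refPath)
      if modTime != lastMod:
        lastMod = modTime
        # Old texture is freed by the binding's destructor here;
        # with C-style bindings you'd call unloadTexture on it first.
        tex = loadTexture(refPath)

      beginDrawing()
      clearBackground(RayWhite)
      drawTexture(tex, 0, 0, White)
      endDrawing()

    closeWindow()
    ```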



  • monkeys with a 3rd robot arm

    Not sure if it's the same, but I've seen a video of that where the monkey's arms are partially restricted yet still moving (and another that says reenactment at the start). Interesting, but it might just be a cloned signal rather than independent control.

    Though I guess swapping between sets and some basic controls (hold, gimbal, return to rest pose, etc.) wouldn't be bad, especially the more naturally it can be controlled; it just seems like something different if it isn't independent control.

    full-brain mesh of electrodes, could allow people to use multiple full bodies at once
    or that multiple brains couldn't be connected and made to work in parallel (brain hemispheres already do that)

    I've had the exact opposite thought: multiple brains (in the sense of multiple people) residing in the same body. Usage shifts (to allow rest), partial control, or even simply observation/eyes-in-the-back-of-your-head/backup/advice/talking, etc.

    That definitely would allow at least 4 arms.

    On a side note, in the Blender Open Movie CHARGE there's a cool robot design where it starts out with 1 big (no-hand) arm and 2 little arms on the other side, and then it transforms them into 2 normal arms.


  • I often think about a skeuomorphic VR experience. Like a virtual room inside your own head that doesn't cut you off from the senses available to your body; at most they'd just be presented in a different way, much like the cartoon/trope (though things like hearing/smell/temperature could definitely stay direct). Even then, I'm not sure if certain things like tilt or momentum, etc., should be represented or if that should just be always-on.

    Though for me, I'd want it to mostly just be the equivalent of a body tracker (plus mouse/KB/controller emulation) hooked up to a single-board computer that can be more easily swapped out/upgraded (or to any normal desktop). As in, no internet directly to the brain. That would be good enough to play all of today's VR games, and easier to jump out of than taking off a headset and trackers.

    Direct input of a computer screen would probably be easier and good enough most of the time, though. Then again, it might be cool to invite people into your brain house. Also, in some cases, imagine controlling your body with dials/levers and/or coordinates (and visualized data) while still feeling it.






  • My thinking is that there are also physical health issues and other issues that make physical activity less viable. Human-powered travel (walking/biking) would be a big help*, or just more time/space/money/comfort/motivation(s)** (alternatively: if healthy options were more of a norm/incentive rather than a lucrative market to chase), which is even less likely than changes to zoning/density and infrastructure.

    In any case, sure, improving someone's life in 1 aspect will provide benefits… but is anyone actually going to help with that, or is it just going to result in more of the same platitudes that are already heard? I don't think any study has much chance of creating policy in the USA any decade soon.

    *= That's from experience… I'm in a semi-rural area and started biking right before the trail closed for renovation ~6 months ago. It's still closed, with no ETA other than “early 2024”.

    **= Aside from health/personal/travel reasons, maybe it's for a hobby. Getting something out of the activity (money, electricity, usable mechanical energy) would be nice if it weren't a problem of cost/storage/loss/logistics, etc.

    EDIT: And I should say Bowling Alone is in force here too, but again, money is probably a big part of that as well.