• 12 Posts
  • 22 Comments
Joined 1 year ago
Cake day: July 17th, 2023

  • Be sure to check where the trackpad is. Centered is better. My new one is more to the left and my wrist hits it when playing TF2, so I do occasionally get some stray cursor movement in game, but not much.

    There should be an option in your OS to disable the trackpad while you’re typing. My laptop also has a trackpad to the left and I often have my hand over it when playing, but I’ve never had this issue.
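
    For what it’s worth, on a GNOME-based Linux desktop that option is a single setting. A minimal sketch, assuming GNOME (the schema and key below are GNOME-specific; Windows and macOS expose the same toggle in their trackpad settings UI instead):

    ```python
    # Sketch: turn on GNOME's "disable touchpad while typing" setting by
    # shelling out to gsettings. GNOME-specific; other desktops expose
    # the same toggle through their own settings UIs.
    import subprocess

    subprocess.run(
        ["gsettings", "set",
         "org.gnome.desktop.peripherals.touchpad",
         "disable-while-typing", "true"],
        check=True,  # raise if gsettings is missing or the key is absent
    )
    ```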


  • Make sure you get a laptop with a modern Ryzen processor, since battery life (and performance on battery) is often a lot better than Intel’s. There are a lot out there that fit the bill, like Lenovo’s Yoga/IdeaPad lineup. Just be wary of two things:

    • Some 14" laptops may have soldered RAM or SSDs, making them impossible to upgrade
    • Don’t go off of processor names; they’re often pretty misleading. For example, a Ryzen 7 7730U is significantly worse than a Ryzen 7 7840U, as the sketch below shows.
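
    The difference hides in the third digit of the model number. A toy decoder based on my reading of AMD’s published naming scheme for 2023+ mobile parts (simplified, and the function itself is invented):

    ```python
    # Toy decoder for AMD's 2023+ mobile Ryzen model numbers.
    def decode_ryzen_mobile(model: str) -> dict:
        digits = "".join(ch for ch in model if ch.isdigit())
        assert len(digits) == 4, "expected a model like 7730U or 7840U"
        year, segment, arch, feature = digits
        return {
            "portfolio_year": 2016 + int(year),   # 7 -> 2023
            "segment_digit": segment,             # roughly Ryzen 3/5/7/9
            "architecture": f"Zen {arch}",        # the digit that matters
            "power_suffix": model[len(digits):],  # U / HS / HX power class
        }

    # Same "Ryzen 7" branding, very different cores underneath:
    print(decode_ryzen_mobile("7730U")["architecture"])  # Zen 3 (2021-era)
    print(decode_ryzen_mobile("7840U")["architecture"])  # Zen 4 (current)
    ```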


  • It saddens me deeply that consumers (gamers) just don’t give a flying fuck about this and continue to pay a premium for Nvidia cards.

    It doesn’t help that AMD isn’t competing that much price-wise. Their only saving grace is higher VRAM, and while that’s nice, raw performance is becoming less relevant. FSR also doesn’t compete with DLSS; it’s strictly worse in every way. And they barely exist in the laptop market: I was just considering buying a new gaming laptop, and my options were an RTX 4060 or paying more for the one laptop with a weaker AMD GPU.

    I would argue Intel is shaping up to be the real competitor to Nvidia. They had a rough start, but their GPUs are very price-competitive. Their newer integrated GPUs are also currently the best, they’re good for generative AI, their raytracing performance trumps AMD’s, and XeSS is a lot better than FSR. If I were in the market for a new GPU, I’d probably grab the Intel Arc A770. I’m looking forward to their next generation of cards.




  • I still can’t understand why Google keeps hyping up Bard and then releasing it in a poor state, only to damage their own reputation. So far we’ve had:

    • Bard 1, which was hyped up to be the ChatGPT successor. It turned out to be really bad.

    • Bard 2.0, a massive update that was hyped up to make Bard so much better. It turned out to still be pretty bad (but in fairness it was a minor improvement).

    • Google Gemini, their massive response to GPT-4 that was, on paper, the best LLM in the world. They finally integrated it into Bard last month and… it’s still not great. I couldn’t tell an immediate difference between this and the old Bard. Oh, and the videos they used to advertise Gemini Ultra were fake.

    I’m not going to armchair-analyze a hugely successful company, but from my point of view it really shows how mismanaged Google has been over the past decade. Failed projects upon cancelled projects upon increasingly frustrated employees.

    /rant. Anyway, you should consider Perplexity if you want something with search capabilities; I’ve had decent success there. Claude is also significantly better than Bard, but they’ve made free usage very limited lately, so it might be a good option if you’re willing to pay.








  • I’ve heard this a lot and I get it, but I feel like there’s a breaking point where most juniors just won’t put up with it and there will be a drought of genuinely good talent in the industry. Personally, the vast majority of people I know have given up on working in whatever field they wanted (embedded systems, cybersecurity, gaming, etc.) and just became web developers or settled for whatever “easy” jobs they could find. Ironically, you catch companies that don’t hire juniors saying things like “it’s so hard to find anyone that cares”, or recruiters saying hiring for one spot takes months because they can’t find the perfect candidate. Something has to change, imo; there needs to be a clearer path than telling everyone to get 5 years of experience and come back when they’re ready.

    And that’s not even mentioning how recruiters now rely on AI to scan CVs and filter people. Most of the time it doesn’t even matter how good you are or what amazing projects you’ve made; you’ll get filtered out if you don’t have whatever arbitrary thing they’re asking for.
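
    As a toy illustration of how blunt that kind of screening can be (the required terms and the CV text here are entirely invented), a strong candidate gets auto-rejected for missing one arbitrary keyword:

    ```python
    # Toy keyword screen. Real ATS filters are fancier, but the failure
    # mode is the same: no keyword, no human ever reads the CV.
    REQUIRED_TERMS = {"kubernetes", "react", "5 years of experience"}

    def passes_screen(cv_text: str) -> bool:
        text = cv_text.lower()
        return all(term in text for term in REQUIRED_TERMS)

    cv = "Shipped a game engine, contributed to LLVM, 4 years of C++."
    print(passes_screen(cv))  # False -> rejected before anyone looks
    ```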


  • popcar2@programming.dev to Programming@programming.dev · Good Developers are Always in Demand

    I know this post probably wasn’t intended to be malicious, but it’s insane that you wrote it without realizing how much privilege it exudes and how little it understands about why people can’t find a job.

    I graduated with my CS degree over a year ago. Excellent GPA, with honors. I’ve been learning game dev since college and have been (sort of) doing it professionally since graduation. I’ve done a 4-month internship, two mediocre part-time jobs, and some freelancing, and I still can’t find a proper job. The industry is collapsing and the job market is flooded with talent with a dozen years of experience. Combine that with the fact that I live in a poor country where there aren’t many game dev jobs and companies are scaling back work from home, and finding one is a nightmare.

    Let me get this straight: the blog post says you’ve been working for 10 years, maybe more. You already have an insane amount of experience and a track record with companies.

    “So what did I do right?”

    Maybe working in the industry for a dozen years has something to do with being able to find a job easily. With less than 5 years of experience you would have struggled to even reach an interview, and if you did, someone with a more stacked CV would have taken the job instead. This has some “why don’t millennials just buy a house?” energy.


  • We have a big conference every year where I live for the tech industry. It’s hit or miss depending on the presenter, and it’s usually a miss. Many talks last over an hour when they could’ve been a much shorter YouTube video and are just there to pad time. Also, 95% of the people are there for other reasons: looking for investors, trying to get hired, browsing the booths, etc. Despite the crowds, it’s very clear most people don’t actually care about the talks and spend them doing anything else on their phones.

    I think in-person conferences can be great experiences when done right, but I never really got anything out of this one. For all the talk about networking with others, they give very few opportunities to actually do it. When everyone is looking for opportunities from everyone else, it feels almost like a competition to talk with companies and important people, and it usually boils down to them asking for my contact info so they can flush it down the toilet. I don’t know, I’ve just had bad experiences with them.


  • “downvotes come at a ‘cost’, whereby if you want to downvote someone you have to reply directly to them with some justification, say minimum number of characters, words, etc.”

    I think it’s the complete opposite. Platforms with downvotes tend to be less toxic because you don’t have to reply to insane people to tell them they’re wrong, whereas platforms like Twitter get really toxic because you only see the likes, so people get into fights and try to “ratio” each other, which actually increases the attention a bad post gets and spreads its message to more people.

    In general, platforms without upvotes/downvotes tend to be the most toxic, imo. Old-school forums and 4chan are a complete mess because low-effort troll content is as loud as high-effort, thoughtful content. It takes one person to derail a conversation and get people fighting about something else, but with downvotes you just lower their visibility. It’s basically crowdsourced moderation, and it works relatively well.
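
    A minimal sketch of that visibility mechanism, with made-up scoring and a made-up threshold: comments that sink below a score floor simply stop being shown, so they stop collecting replies and attention.

    ```python
    # Crowdsourced moderation in miniature: demote by score, hide below
    # a floor. Threshold and numbers are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Comment:
        text: str
        upvotes: int
        downvotes: int

        @property
        def score(self) -> int:
            return self.upvotes - self.downvotes

    def visible(comments: list[Comment], floor: int = -3) -> list[Comment]:
        kept = [c for c in comments if c.score > floor]
        return sorted(kept, key=lambda c: c.score, reverse=True)

    thread = [
        Comment("thoughtful take", 42, 3),
        Comment("low-effort flamebait", 2, 30),  # hidden, never "ratioed"
    ]
    for c in visible(thread):
        print(c.score, c.text)  # only the thoughtful take is shown
    ```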

    As for ways to reduce toxicity: shrug. Moderation is the only thing that really stops it, but if you moderate too heavily you’ll be accused of censorship, and telling people not to get mad is just not going to happen.

    My idea for less toxicity is better filtering options for what people want to see. Upon joining a platform, it would offer easy options to filter out communities that are political or controversial. That’s what I’m doing on Lemmy; I’m here for entertainment, not arguing.




  • “Yet Markdown languages are far, far more limited in both scope and functionality than HTML is. How do you bridge this gap without making it just as complex?”

    That’s a really big topic, but in general I’d combine theming and markup into one language (not necessarily coupling CSS and HTML in one file, but having something that does both with similar syntax and rules), make things simpler so there’s one clear way of doing something rather than a generic container for everything, etc.
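
    Purely as a hypothetical sketch of what “structure and style in one syntax” could feel like (every name here is invented; this illustrates the idea, it is not a real proposal):

    ```python
    # One call builds a styled element: structure and theming declared
    # together, rendered down to plain HTML with inline styles.
    def node(tag: str, *children: str, **style: str) -> str:
        css = ";".join(f"{k.replace('_', '-')}:{v}" for k, v in style.items())
        return f'<{tag} style="{css}">{"".join(children)}</{tag}>'

    page = node(
        "main",
        node("h1", "Hello", color="navy", font_size="2rem"),
        node("p", "One syntax for markup and styling.", max_width="60ch"),
        padding="2rem",
    )
    print(page)
    ```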

    “No matter what you propose, unless it’s 100% absolutely perfect (and nothing ever is) you’ll end up in the same situation.”

    Obviously a few things will get deprecated over time, but the reason web dev is the way it is now is that technology used to be a lot more limited and websites were a lot simpler. 25 years ago, nobody knew what the “modern web” would look like, so it was made up as people went along. We now know what specifications we’d need; if anybody went back and re-did them, I think you’d end up with something better.

    “People say the same about no-code frameworks. There’s a good reason that stuff doesn’t work beyond the absolute basics.”

    I don’t think they’re comparable. Obviously you wouldn’t use a GUI and drag-and-drop for everything; you’d still be able to add sections with code.

    The fact that WordPress powers almost half the internet is proof that a simpler web dev experience like this is in demand and can work. Most websites don’t need something complex, just something intuitive that supports rapid development and doesn’t make it easy to fall into bad practices. Like I said, it’s a hot take, but I would prefer it so much this way.



  • Web development feels like it’s stuck in the early 2000s. I’ve ranted a lot about it over the years, but I just don’t know how everyone is okay with it. I’m sure tons of people will disagree.

    HTML is bad. The language itself feels unintuitive and clunky compared to modern markup languages like Markdown, and let’s be honest, your webpage just consists of nested <div> tags.

    CSS is bad. Who knew styling could be so unintuitive and unmanageable? Maybe it made sense 25 years ago, but now it’s just terrible, and it’s very clunkily integrated with HTML in my opinion. Styling and markup should be one easier-to-use language where 50% of it isn’t deprecated.

    JavaScript has been memed to death, so I won’t even go there. TypeScript is OK, I suppose.

    And now for my hottest take: 10+ years ago I saw web-building tools like Wix and fully expected web development to head in the direction of using a GUI to create, style, and script from one interface, even letting you create and preview dynamic content instantly. I’ve seen competitors and waited for “the big one” that’s actually free, open source, and good enough to be used professionally. It never happened. Web dev has just gone backwards and gotten stuck in its old ways; now it’s a bloated mess that takes way more time than it deserves.

    The Godot engine is actually a pretty good option for creating GUI apps, and it’s exactly what I envisioned web dev becoming this past decade: one language, an intuitive interface, simple theming, and easy rapid development… Shame it never happened.