• 0 Posts
  • 486 Comments
Joined 1 year ago
Cake day: June 20th, 2023

  • If it’s part of the Requirements that the frontend should handle “No results found” differently from “Not authorized”, even if that’s just by showing an icon, then each list of stuff which might or might not be authorized should have a flag signalling that.

    (This is simply data analysis: if certain information is supposed to be shown to the user, it has to come from somewhere, and hence the frontend must get it from somewhere. Frontend code trying to “deduce” it from the data it gets is generally prone to the kind of problem you just ran into, because unless the deduction is explicitly agreed and documented, sooner or later the deduction done by one team will not match what the other team is doing. It’s generally safer to explicitly pass that info in a field meant for that purpose, to avoid frontend-backend integration issues.)

    Authorization logic is almost always a responsibility of the backend (for various reasons, including proper security practices), and for the frontend it’s generally irrelevant why something is or isn’t authorized - unless you have to display, per list, the reason for it being authorized or not, which would be a strange UI design IMHO. Generally there’s just a flag in the main part of the UI and a separate page/screen with detailed authorization information for when the user really wants to dig down into the “why”, and that page/screen would use a different API call to fill itself in.

    So if it is indeed required that the frontend can tell whether an empty result is due to “Not Authorized” rather than “No results found” (not an uncommon design, though good UI practice is generally to not even give the user access to listing things they’re not authorized to see, rather than letting them choose a listing and then telling them they’re not authorized to see it - the latter design is more frustrating for users), then that info should be an explicit entry in what comes from the backend, for example as sketched below.
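    As a minimal sketch of what that could look like (the type and field names here are made up for illustration, not taken from your actual API, and Item stands in for whatever your list elements are):

        import java.util.List;

        // Hypothetical response DTO: the frontend never has to deduce
        // authorization from the shape of the data, it just reads the flag.
        public class ItemListResponse {
            public boolean authorized; // explicit reason why items may be empty
            public List<Item> items;   // always a list, possibly of length 0
        }

        // serializes as {"authorized": false, "items": []}  <- not allowed to see it
        //           or  {"authorized": true,  "items": []}  <- genuinely no results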

    The JSON is indeed different in both cases, but if handled correctly it shouldn’t matter.

    That said, IMHO, if all 3 of those fields in your example are supposed to be present, the backend should be putting a list in all 3 fields, even if for some the list is empty, rather than a null in some. It doesn’t matter what the JSON looks like: even at the Java backend level, a List variable holding null is not the same as a List variable holding a list of length 0 - null vs empty list is quite a common source of mistakes even within the code of a single tier, and it’s worse if it ends up in API data.
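    To make that difference concrete, a minimal Java sketch (the names are made up for the example):

        import java.util.Collections;
        import java.util.List;

        public class NullVsEmpty {
            public static void main(String[] args) {
                List<String> empty = Collections.emptyList(); // a real list with zero elements
                List<String> missing = null;                  // no list at all

                System.out.println(empty.size());   // prints 0 - safe to iterate over or render
                System.out.println(missing.size()); // throws NullPointerException at runtime
            }
        }

    Which is why the usual convention is for the backend to always return an empty list rather than null to mean “nothing here”.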

    Who is wrong or right ultimately depends on the API design having marked those fields as mandatory or optional.


  • That sounds like an error in the specification of the client-server API, or an erroneous implementation on the server side in the last version: nothing should be signalled via the presence or absence of fields when using JSON, exactly because, as I described in my last post, the convention with JSON is that unexpected fields should be ignored and absent fields carry no meaning at all, for the sake of backwards compatibility - which breaks if all of a sudden presence or absence is treated as having meaning.

    Frankly, the fact that there isn’t a specific field signalling authorized/not-authorized leads me to believe that whoever designed that API isn’t exactly experienced at that level of software design: authorization information should be explicit, not implicit, otherwise you end up with people checking for not-in-spec side effects - like you did - exactly for that reason (i.e. “is no data being returned because the user is not authorized, or because there was indeed no data to return?”). That is prone to break, since anything not properly part of the spec can be interpreted differently by any of the teams working on it and/or changed at any moment.


  • If I remember it correctly, the usual convention with JSON APIs is that when a key is present but not expected it should be ignored.

    The reason for that is to maintain compatibility between versions: it should be possible to add more entries to the data and yet old versions of the software that consumes that data should still continue to operate if all the data they’re designed to handle is still there and still in the correct format.

    Sure, that’s not a problem in the blessed world of web-based frontends, where the user’s browser just pulls the client code from the server so frontend and backend are always in sync, but it is a problem for all the other kinds of frontend out there where the life-cycle of the client application and that of the server are different - good luck getting all your users to update their mobile apps or whatever, every time you want to add functionality (and hence data in the client-server comms) to that system.

    (Comms API compatibility is actually one of the big problems in client-server systems development)

    So it sounds like an issue with the way your JavaScript library handles JSON, or with your own implementation not handling, per spec, the presence of data which you don’t use.
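    For comparison, this is roughly what spec-tolerant consumption looks like on a Java stack using Jackson (purely as an illustration of the “ignore what you don’t expect” rule - most JSON data-binding libraries, JavaScript ones included, have an equivalent):

        import com.fasterxml.jackson.databind.DeserializationFeature;
        import com.fasterxml.jackson.databind.ObjectMapper;

        public class TolerantClient {
            static class User {
                public String name; // the only field this client version knows about
            }

            public static void main(String[] args) throws Exception {
                ObjectMapper mapper = new ObjectMapper()
                    // Don't blow up when the server sends fields this client
                    // doesn't know about - that's the backwards compatibility.
                    .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);

                // Old client only knows "name"; a newer server also sends "email".
                String json = "{\"name\":\"Alice\",\"email\":\"alice@example.com\"}";
                User user = mapper.readValue(json, User.class);
                System.out.println(user.name); // "Alice" - the extra field is simply ignored
            }
        }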

    Granted, if the server-side dev only makes stuff for your frontend, then he or she needs not be an asshole about it and can be more accommodating. If, however, that data also has to serve other clients, then I’m afraid you’re the one in the wrong: you’re demanding that nobody else get to rely on the backwards compatibility that convention exists to provide - which, as I pointed out, is a massive problem when you can’t guarantee that all client apps get updated as soon as the server does - because you couldn’t be arsed to do your implementation correctly.


  • I still remember well how the New Labour faction, in order to attack Corbyn, deemed him an anti-semite by association because at a conference he sat on a panel with a guy who compared the actions of Israel to those of the Nazis. Turns out said guy was a Jewish Holocaust survivor, so they were effectively calling a Jewish Holocaust survivor an anti-semite in order to slander Corbyn by association.

    Also, quite a lot of Israel-linked “Jewish” associations cooperated in that whole campaign of slandering Corbyn and Labour as “anti-semite” (remember how the Labour Party was said to have “many” anti-semites, who then turned out to exist in a lower proportion than in society in general, and far lower than among the Tories) to bring Corbyn down and replace him with somebody from the New Labour faction - which succeeded.

    The New Labour leadership has quite the debt to the Zionists for that, so don’t expect them to be any less unwaveringly supportive of a genocidal ethno-Fascist Israel than the Tories - at best it’s going to be like Biden in America: claiming to want Bibi to stop, all the while de facto supporting him through their actions.


  • I came here to say the same.

    People in the technical career track spend most of their time making software, one way or another (there comes a point where you’re doing more preparation to code than actual coding).

    As soon as you jump into the management career track it’s mostly meetings to report the team’s progress to upper management, even if you’re supposedly “technically oriented”.

    Absolutely - as you become a more senior tech, things become more and more about figuring out what needs to be done at higher and higher levels (i.e. systems design, software development process design), which means interacting with more and more stakeholders (your whole team, other teams, end users, management) and hence more meetings, but you still get to do lots of coding, or at least code-adjacent stuff (i.e. design).


  • Aceticon@lemmy.world to Programmer Humor@programming.dev · Any Volunteers · 2 months ago

    The inability to detail the idea all the way down to the level where something concrete can be made from it kills it well before any lack of coding skills does.

    It’s like the difference between having an idea for a book and writing an actual book that is enjoyable to read: there is no “knowing how to code” barrier there, and yet most people can’t actually pull it off when they try, or it ends up shallow and uninteresting.


  • Don’t take this badly but it sounds like you’ve only seen a tiny slice of the software development done out there and had some really bad experiences with Agile in it.

    It’s perfectly understandable: there are probably more bad uses of Agile out there than good ones and certain areas of software development tend to be dominated by environments which are big bloody “amateur hour every hour of the day, every day of the year” messes, Agile or no Agile.

    That does not, however, mean that your experience stands for the entirety of what’s out there, trumping even the experience of other people who also work in QA in environments where Agile is used.


  • Aceticon@lemmy.world to Programmer Humor@lemmy.ml · Users · 2 months ago

    Agile was definitely taken up with all the irrationality of a fashion at some point.

    It’s probably the best software development process philosophy for certain environments (for example: where there are fast-changing requirements and easy access to end users) whilst being pretty shit for others (good luck trying to fit it in at a process level when some of the software development is outsourced to independent teams, or using it for high-performance systems design). It eventually came out of that fad period being used more for the right things (even if, often, less than properly) and less for the wrong things.

    That said, the Agile-as-fad phase was over a decade ago.


  • Aceticon@lemmy.world to Programmer Humor@lemmy.ml · Users · 2 months ago

    Agile made Management - who until then had actual Senior Designer-Developers and Technical Architects designing and adjusting the actual development processes - think they had a silver-bullet software development recipe that worked for everything, so they didn’t need those more senior (read: more expensive and unwilling to accept the same level of exploitation as the more junior types) people anymore.

    It also drove the part of the Tech Industry that relies mainly on young and inexperienced techies and management (*cough* Startups *cough*) to think they didn’t need experienced techies.

    As usual, it turned out that “there are no silver bullets” and things are more complex: Agile doesn’t work well for everything, and various individual practices of it only make sense in some cases (in some they’re even required for the rest to work properly) whilst in others they’re a massive waste of time (and sometimes the useful-wasteful balance depends on frequency and timing); plus, in some situations (outsourced development) they’re extremely hard or even impossible to pull off at project scope.

    That said, I bet that what you think of as “The Industry” is mainly Tech companies in the US, rather than where most software development actually occurs: large non-Tech companies with a high dependency on software for competitive advantage - such as Banks - and hence with more than enough specific software requirements to hire vast software development departments to develop custom solutions in-house for their specific needs.

    Big companies whose success depends on their core business-side employees doing their work properly care a lot more about software not breaking or even delaying their business processes (and hence hire QA to find those problems in new software before it ever gets to the business users) than Tech companies providing software to non-paying retail users who aren’t even their customers (the customers are the advertisers they sell access to those users’ eyeballs to), and who will therefore shovel just about anything out and hopefully sort out the bugs and lousy UX/UI design through A/B testing and user bug reports.


  • Aceticon@lemmy.world to Programmer Humor@lemmy.ml · Users · 2 months ago

    Yeah.

    Any good software developer is going to account for, and even test, all the weird situations they can think of … but not the ones they cannot think of, since they’re not even aware of those as a possibility (if they were, they would account for and test them too).

    Which is why you want somebody with a different mindset to independently come up with their own situations.

    It’s not a value judgment on the quality of the developer; it’s just accounting, at the software development process level, for the fact that humans are not all-knowing - not even devs ;)


  • Aceticon@lemmy.world to Programmer Humor@lemmy.ml · Users · 2 months ago

    “Wrong way” for whom?

    In Software Development it ultimately boils down to: are you making software for the end users, or are you making it for yourself?

    Because in your example, that’s what ultimately defines whose “wrong” the developer is supposed to guide him/herself by.

    (So yeah, making software for fun or for your own personal use is going to follow quite different requirement criteria than making software for use by other people.)