• 0 Posts
  • 15 Comments
Joined 1 year ago
Cake day: July 20th, 2023

  • You should read the article because it’s way fucking crazier than you think:

    Sources told WTVY that Michael Halstead informed officers that they had put Logan’s body into the freezer on October 11. The sources claimed that Headland Police failed to find the body which was allegedly wrapped up in blankets and tarpaulin. Halstead was also arrested that day and jailed for ten days for failing to show up to court on domestic abuse charges.

    The police KNEW there was a body in the freezer because the dad TOLD them. The police failed to find the body they knew was on the property IN A FREEZER.

    How did it get in the freezer in the first place? No one can really say, not even the guy who put it there:

    Sheriff Blankenship said that Halstead claimed to have had a manic episode and couldn’t remember how the body got into the freezer.

    So yes, it’s even crazier. It’s not really clear from the article what the cause of death was, but a bipolar dad who doesn’t remember exactly why he put a body in a freezer is a pretty solid suspect. Shit is absolutely wild, and I’m just sitting here wondering how many freezers they had on the property for the police to not find it AFTER BEING TOLD IT WAS THERE.

  • I think it was more poking fun at the fact that the developers, not the LLM, basically didn’t do any checks for edible ingredients and just passed the input straight to an LLM. What I find kind of funny is you could’ve probably offloaded the input validation to the LLM itself by asking a few specific questions about whether or not each item was safe for human consumption and/or traditionally edible. Aside from that, it seems like the devs would have access to a database of food items to check against, since the app was developed by a grocery store…

    I do agree, people are trying to shoehorn LLMs into places they really don’t belong. There also seem to be a lot of developers just straight piping user input into a custom ChatGPT query and spitting the output back to the user. It really does turn into a garbage-in, garbage-out situation for a lot of those apps.

    On the other hand, I think this might be a somewhat reasonable use for LLMs if you spent a lot of time training it and did even the most cursory input validation. I’m pretty sure it wouldn’t take a ton of work to steer clear of completely horrendous results like the “aromatic water mix” or “rat poison sandwich” called out in the article. Something like the rough sketch below would already go a long way.
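
    Just to make “cursory input validation” concrete, here’s roughly the kind of thing I mean. This is a Python sketch with made-up names and a stubbed-out model call, not whatever the grocery store actually built:

    ```python
    # Hypothetical pre-flight checks before the recipe prompt ever gets built.
    # KNOWN_FOOD_ITEMS stands in for the grocer's own product database.
    KNOWN_FOOD_ITEMS = {"bread", "cheese", "onion", "rice", "chicken"}

    def ask_llm(prompt: str) -> str:
        """Placeholder for whatever chat-completion call the real app makes.
        Stubbed with canned answers so the sketch runs on its own."""
        if prompt.startswith("Answer only YES or NO"):
            return "NO"  # the real call would go out to the model here
        return "(recipe text would come back from the model here)"

    def is_probably_edible(ingredient: str) -> bool:
        # Gate 1: does the store's own food database recognize the item?
        if ingredient.lower() in KNOWN_FOOD_ITEMS:
            return True
        # Gate 2: ask the model a narrow yes/no safety question instead of
        # hoping the recipe prompt filters it out implicitly.
        answer = ask_llm(
            f"Answer only YES or NO: is '{ingredient}' an ingredient that is "
            "safe and traditionally eaten by humans?"
        )
        return answer.strip().upper().startswith("YES")

    def generate_recipe(ingredients: list[str]) -> str:
        rejected = [item for item in ingredients if not is_probably_edible(item)]
        if rejected:
            return "Sorry, I can't make a recipe with: " + ", ".join(rejected)
        return ask_llm("Write a simple recipe using only: " + ", ".join(ingredients))

    if __name__ == "__main__":
        print(generate_recipe(["bread", "cheese"]))   # passes both gates
        print(generate_recipe(["bread", "ammonia"]))  # rejected before any recipe prompt
    ```

    Even that much would have stopped the rat-poison-sandwich class of inputs before any recipe prompt got built.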