I’m glad to see for once the fines are proportional to revenue, and not a fixed amount. 6% hurts.
There is a theory based on the fact that ocean tankers’ exhaust historically included sulfates, which can actually seed cloud formation.
Recent emissions regulations reduced this effect, so fewer clouds are being seeded over the ocean, and the oceans are absorbing more sunlight and heating more.
So we were basically painting large swaths of reflective clouds over the oceans, masking the heating. And now we’re seeing unencumbered heating effects.
It’s especially infuriating when you have a giant like Microsoft rolling Electron on their flagship applications (Teams), and then deprecating support for some platforms (Linux). What’s the point of your nice, memory-gobbling, platform-agnostic app framework if you’re not even going to use it to provide cross-platform support?
This is how the legal system typically works when a new law is introduced.
If an instance is merely blocked, does that mean all content produced by that instance, or by a Lemmy.World user using that instance, is strictly not stored on Lemmy.World servers?
Otherwise there might still be liability. Also, in the US you don’t even have to do anything illegal to be the target of a lawsuit—distancing from piracy is a practical defense against the cost of legal proceedings, even if it’s technically legal.
Amdahl’s isn’t the only scaling law in the books.
Gustafson’s scaling law looks at how the hypothetical maximum work a computer could perform scales with parallelism. The idea is that for certain tasks like simulations (or, to your point, even consumer devices to some extent) that can scale up to fully utilize the extra hardware, this is a real improvement.
Amdahl’s takes a fixed program, considers what portion is parallelizable, and tells you the speedup from additional parallelism in your hardware.
One tells you how much work a processor might do; the other tells you how fast a program might run. Neither is wrong, but both are incomplete pictures of the colloquial “performance” of a modern device.
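For a concrete sense of the difference, here’s a quick sketch (my own illustration, using a hypothetical workload that is 95% parallelizable) of what each law predicts:

```python
# Rough illustration, not from any particular textbook or library.
# p = parallelizable fraction of the work, n = number of processors.

def amdahl_speedup(p: float, n: int) -> float:
    # Fixed problem size: the serial fraction (1 - p) caps the speedup.
    return 1.0 / ((1.0 - p) + p / n)

def gustafson_speedup(p: float, n: int) -> float:
    # Problem size grows with n: scaled speedup stays nearly linear in n.
    return (1.0 - p) + p * n

for n in (2, 8, 64):
    print(f"n={n:>2}: Amdahl {amdahl_speedup(0.95, n):5.2f}x, "
          f"Gustafson {gustafson_speedup(0.95, n):6.2f}x")
```

With that 95% figure, Amdahl’s tops out around 15x even at 64 cores, while Gustafson’s scaled speedup keeps climbing toward ~61x, which is exactly the “incomplete pictures” point above.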
Amdahl’s is the one you find emphasized by a Comp Arch 101 course, because it corrects the intuitive error of assuming you can double the cores and get half the runtime. I only encountered Gustafson’s law in a high performance architecture course, and it really only holds for certain types of workloads.
ABSOLUTELY. I just recently capped off the Diff Eq, Signals, and Controls courses for my undergrad, and truly by the end you feel like a wizard. It’s crazy how much problem-solving/system modeling power there is in such a (relatively) simple, easy to apply, and beautifully elegant mathematical tool.
I’m still a big fan of D for personal projects, but I fear the widespread adoption ship has sailed at this point, and we won’t see the language grow anymore. It’s truly a beautiful, well-rounded language.
Also, just recently a rather prominent contributor forked the entire compiler/language, so we’re seeing more fragmentation :/