• 34 Posts
  • 217 Comments
Joined 1 year ago
Cake day: June 9, 2023






  • Intel’s big shift was to maintain compatibility as improvements were made and new fab nodes were introduced. No one else did this very well. The baseline for this was the 16-bit Intel 8086, which is why we call the architecture x86. A program written for an 8086 should still run on a brand-new i9-14900.

    Motorola was the big-endian camp, storing bytes in the opposite order from Intel. They did plenty of odd things too, including some notoriously possessive, egomaniacal business decisions.

    A couple of key people stand behind the microprocessor. Federico Faggin (https://en.m.wikipedia.org/wiki/Federico_Faggin) is the guy behind the Intel 4004 (the first microprocessor), the Intel 8080, and the Zilog Z80.

    Bill Mensch (https://en.m.wikipedia.org/w/index.php?title=Bill_Mensch) is the guy behind the Motorola 6800 and the MOS 6502.

    I have no idea where people get the idea that the 6502 has anything to do with ARM. ARM stands for Acorn RISC Machine, later Advanced RISC Machine, and RISC is a fundamentally different architecture from CISC.

    The 6502 wasn’t really positioned in this RISC/CISC paradigm; it was simply dirt cheap when everything else was much, much more expensive. Its only real innovation was an extremely primitive pipeline in which the next instruction is fetched while the current one executes. MOS Technology’s fabrication couldn’t compete with the higher-frequency parts from other companies, so this was a clever hack to get more done per clock on the cheap. The 6502 is still produced in some form by the Western Design Center (also Bill Mensch).
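    To make that overlap concrete, here is a toy cycle count; it is not a model of real 6502 timing, and the per-instruction cycle numbers are made up:

    ```python
    # Toy comparison: a machine that fetches then executes strictly in sequence
    # versus one that fetches the next opcode while the current instruction runs.
    FETCH = 1  # made-up cost of fetching an opcode, in cycles
    program = [("LDA", 2), ("ADC", 2), ("STA", 3), ("JMP", 3)]  # (mnemonic, execute cycles), illustrative

    sequential = sum(FETCH + execute for _, execute in program)

    overlapped = FETCH  # only the very first fetch is exposed
    for _, execute in program:
        overlapped += execute  # the next fetch hides under these execute cycles

    print(f"no overlap: {sequential} cycles, with prefetch overlap: {overlapped} cycles")
    ```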

    CISC was the old guard; the RISC project came out of Berkeley, while MIPS came out of Stanford. (https://en.m.wikipedia.org/wiki/Reduced_instruction_set_computer)

    ARM is a RISC architecture, and it traces its history back to completely different origins than the other microprocessors mentioned here.

    The funny thing is that the execution core in modern Intel processors, including the Arithmetic Logic Unit (ALU, the CPU secret sauce where the action happens), is essentially a RISC design with a CISC wrapper: the front end decodes complex x86 instructions into simpler RISC-like micro-ops.
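    A toy sketch of that wrapper idea (this is not Intel's actual micro-op format, just an illustration of cracking one memory-touching CISC instruction into load/compute/store micro-ops):

    ```python
    # Illustrative only: a CISC-style instruction that reads memory, adds, and
    # writes memory in one go gets split into simpler RISC-like micro-ops.
    def decode(instruction):
        if instruction == "ADD [0x1000], EAX":
            return [
                "LOAD  tmp0, [0x1000]",   # micro-op 1: read the memory operand
                "ADD   tmp0, tmp0, EAX",  # micro-op 2: pure register ALU op
                "STORE [0x1000], tmp0",   # micro-op 3: write the result back
            ]
        return [instruction]  # simple register-to-register ops pass through as-is

    for uop in decode("ADD [0x1000], EAX"):
        print(uop)
    ```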


  • Not really an easy thing to describe in ELI5.

    PC started out in an era when documented hardware, and specifically second-sourcing of hardware, was important. It was fully documented from the start. Fully documented actually means you can fully own the device: there is no software deprecation mechanism and no ulterior motive where someone can spy on you in the background. It is more complicated now because some parts of x86 are undocumented too, but it isn’t abused the way other architectures are.

    ARM is a proprietary IP and chip-design firm. They don’t really have anything to do with this part of the story, but they are proprietary and are set up to support others that are proprietary as well. You can get assembly-language documentation for the base ARM architecture, but you still won’t know all the exact implementation details and peripheral device blocks on the die.

    Google took open-source software like Linux and set it up so that manufacturers could add their hardware modules (drivers) at the last possible minute as binaries only. The result is what’s called an orphan kernel. While the majority of software on the device is open source, none of the source code for these kernel modules is. This is the deprecation mechanism used to steal ownership: no one can ever update that orphan kernel without the source code for the specific kernel modules needed to run the device. Sometimes you’ll find a device supported by custom ROMs long after the device is dropped. Generally that means someone is doing the enormous job of back-porting changes and security patches from the present all the way back to the state of the old kernel at the time the last binaries were compiled against it.
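    A rough way to see the lock-in on a device you can poke at: a binary-only module records the kernel it was built against ("vermagic"), and the running kernel refuses anything else. This is only a sketch; the module path is hypothetical and modinfo has to be available:

    ```python
    # Compare the kernel a binary-only module was built for with the kernel
    # actually running; a mismatch means the module can't follow a kernel update.
    import os
    import subprocess

    module_path = "/lib/modules/vendor_wifi.ko"  # hypothetical binary-only driver

    out = subprocess.run(["modinfo", module_path], capture_output=True, text=True).stdout
    vermagic = next(
        (line.split(":", 1)[1].strip() for line in out.splitlines() if line.startswith("vermagic")),
        "unknown",
    )

    running = os.uname().release
    print(f"module built for: {vermagic}")
    print(f"kernel running:   {running}")
    print("loadable" if vermagic.startswith(running) else "mismatch: needs a rebuild, but there is no source")
    ```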

    The alternative is to merge the source code into the mainline kernel. Once this is done, the community is likely to maintain the kernel modules for a very long time, like decades. Every phone is a little bit different, so reverse engineering one does nothing for the next.

    There is more to it still. From the flip side, chip fabs are the most expensive commercial human endeavor in history. They require an enormous up-front investment, and your devices largely fund that endeavor. This is a major part of world economic growth. The USA was a military-spending-driven economy until the 1960s; the reason large-scale conflict largely ended for the USA is the shift to venture capital, and that shift happened in the 1960s because of Silicon Valley.

    So it is a balance between economic growth and the fundamental human right of ownership, along with your awareness and expectations in this area. If you do not recognize, or do not care, that you’ve lost ownership over your property, the concept of democracy weakens substantially. You’ve lost autonomy, and that can feel wrong.






  • I’m using cannibalism as the far end of extreme, but I mean the broader scope of the most extreme behaviors. It is easier to approach than its sexual and predatory counterparts.

    Like if there is no potential “greater” social authority likely to interfere, is there a population density that determines overall accountability? Is it the randomness of personalities and spectrums? Is there any evidence of a change over time and social evolution?

    They are hard questions. I wonder if any observational evidence exists around the various dwindling native groups that exist(ed) in various degrees of isolation. It is also a question of how fixation, paranoia, and anxiety may have evolved in the human species over time. It would be really interesting to be able to contrast this kind of behavior potential now versus the deep past.









  • Late but… The USA failed to regulate 5G well enough. Doing so would have forced telecoms to use steeper, more accurate frequency filters like those used in the rest of the responsible world. The 24 GHz 5G band butts up against the water vapor absorption band near 23.8 GHz that weather satellites rely on. (IIRC) The study that the FCC commissioned said something like a failure to isolate and protect that band would set US weather forecasting accuracy back to around the level it was in 1970. As usual, the red jihadist party had absolutely no qualms about such a technological setback, took their political bribes, and failed to regulate to protect it. In their defense, radio is magic, and sky wizard didn’t have any objections via thoughts and prayers.


  • Exhaust products and a small amount of either O2 or unburned propellant. IIRC SpaceX runs a fuel-rich cycle, so it’s mostly exhaust plus unburned fuel.

    Think of it kinda like an isolated atmosphere made by the rocket, suspended in a place nearly without one: a cloud of gas hanging in space where there is otherwise no atmosphere.

    It’s mostly in the sun, which is super hot without an atmosphere to filter and buffer temperature. As soon as the particles pass into the Earth’s shadow, they get super cold and likely freeze.

    The rocket is clearly not at orbital velocity yet and that stuff is going backwards fast, so it will all deorbit fairly quickly.

    You’re not seeing turbulent flow quite like what happens on the ground, or in other parts of the exhaust plume, because there isn’t much pressure in the surrounding region to create the eddies that make the chaotic mixing patterns you see inside a denser medium. I don’t think it is entirely laminar flow, but it is much closer to laminar than what happens in a thick atmosphere.
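    To put rough numbers on that, the Reynolds number (density times speed times length scale, divided by viscosity) is what separates turbulent from laminar-ish flow, and density is the term that collapses up there. The speed and plume size below are guesses, purely for scale:

    ```python
    # Order-of-magnitude sketch; every input is an assumed round number.
    def reynolds(rho, v, L, mu=1.8e-5):  # mu: dynamic viscosity of air in Pa*s
        return rho * v * L / mu

    v, L = 3000.0, 10.0      # assumed exhaust speed (m/s) and plume length scale (m)
    rho_ground = 1.2         # air density at sea level, kg/m^3
    rho_100km = 5.6e-7       # approximate density near 100 km altitude, kg/m^3

    print(f"near the ground: Re ~ {reynolds(rho_ground, v, L):.0e}")
    print(f"near 100 km:     Re ~ {reynolds(rho_100km, v, L):.0e}")
    ```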


  • You are seeing the exhaust plume while it is still dense enough to reflect sunlight.

    There are several massive thermal layers in the atmosphere that effectively act as isolation barriers at various heights. That is why the exhaust on the left appears unique in structure within a certain boundary. The upper layers of the atmosphere get really hot before getting really cold again: commercial jets fly in the cold part, but it gets hot above that, then cold again higher still. The rocket plume on the right is in that upper cold region, the outermost puffy/sparse near-low-Earth-orbit region. You can tell because of how enormously the exhaust plume expands when there is very little atmospheric pressure to contain it.

    There is very little atmosphere way up there, and certainly not enough to produce noticeable Rayleigh scattering. If there were enough, the exhaust plume would be hard to see, with very little contrast against the background; without it, the plume stands out in high contrast against the mostly empty void of LEO space.
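    Back-of-the-envelope, since Rayleigh scattering along a path scales with the number density of air molecules (both densities below are rough approximations):

    ```python
    # Compare molecular number density at sea level with ~100 km altitude.
    n_sea_level = 2.5e25   # molecules per cubic meter at the ground (approx.)
    n_100km = 1.0e19       # molecules per cubic meter near 100 km (very rough)

    print(f"scattering near 100 km is ~{n_100km / n_sea_level:.0e} of sea level")
    ```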






  • I’ve thought about messing with an old router running OpenWRT, since there is a maintained Python package and documented I/O for buttons and LEDs, along with SPI and UART broken out.
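    Something like this is what I had in mind, assuming Python from the OpenWRT package feed and a kernel LED exposed through sysfs; the LED name below is hypothetical, you would list /sys/class/leds/ on the actual router to find the real ones:

    ```python
    # Blink a router LED through the standard Linux LED class in sysfs.
    import time

    LED = "/sys/class/leds/green:status"  # hypothetical name; check /sys/class/leds/

    def set_led(on):
        # writing "255" or "0" to 'brightness' drives the LED via the kernel
        with open(f"{LED}/brightness", "w") as f:
            f.write("255" if on else "0")

    for _ in range(5):
        set_led(True)
        time.sleep(0.5)
        set_led(False)
        time.sleep(0.5)
    ```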

    I also had an old laptop where the board came with several unpopulated I/O headers. The board came in multiple configurations and I had the base model, so it had a bunch of connections I could have reverse engineered (something I am much more confident doing than software). I was curious about the potential to break out these connections but knew it was beyond my abilities at the time. Now that computer is no longer needed, so messing about is much more feasible.

    I’ve got a Raspberry Pi. The point is not to have something that just works, but to understand what “just works” really means. And, like, how to interact more dynamically with a microcontroller with less protocol formality; I tend to get lost in the weeds when I have some simple need and don’t want the overhead of all that complexity.
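    For the "less protocol formality" part, the kind of thing I mean is just raw lines over a serial port, assuming the pyserial package and a microcontroller on a USB-serial adapter (the port name, baud rate, and "ping" message are made up):

    ```python
    # Talk to a microcontroller with no framing or protocol, just bytes and newlines.
    import serial  # pip install pyserial

    with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as port:
        port.write(b"ping\n")    # send a raw line
        reply = port.readline()  # read whatever comes back, if anything
        print(reply.decode(errors="replace").strip())
    ```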