![](https://lemmy.eatsleepcode.ca/pictrs/image/a7d224d5-fcc1-4557-aad5-39d36aa370ea.png)
![](https://lemmy.world/pictrs/image/0943eca5-c4c2-4d65-acc2-7e220598f99e.png)
That's better than live laugh liao
Crazy fast development on this app in the last 2 months.
Beta is full 🫠
The momentum of Lemmy is very impressive as of late!
Sometimes you get in too deep
What AMA? It was nothing more than 3–4 pre-written comments that didn't answer anything.
Wait… really?
I have literally 0 experience with Swift or SwiftUI but…
The relevant code here looks to be using swift-cached-image already. Perhaps it's the scaling / filtering?
I don't have an iOS development environment set up, but without the
.scaledToFill()
.blur(radius: showNsfwFilter ? 30 : 0)
.clipShape(RoundedRectangle(cornerRadius: 8))
.overlay(RoundedRectangle(cornerRadius: 8)
.stroke(Color(UIColor.secondarySystemBackground), lineWidth: 1))
this would probably be much faster, given that the input URLs are unbounded / you have no control over the image size. Are the blur/clip/stroke all async as well?
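To sketch the idea: since the remote images are of unbounded size, one common mitigation is to decode a downsampled thumbnail with ImageIO before handing it to SwiftUI, so the blur/clip/stroke modifiers operate on a bounded pixel count. This is a hypothetical helper (not from the Mlem codebase) that assumes the image is already available as a local file URL:

```swift
import UIKit
import ImageIO

// Decode a downsampled thumbnail instead of the full-size image, bounding
// the pixel count that later modifiers (.blur, .clipShape) have to process.
func downsampledImage(at url: URL, maxDimension: CGFloat, scale: CGFloat) -> UIImage? {
    // Don't decode the full image up front; we only want the thumbnail.
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else {
        return nil
    }
    let thumbnailOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceShouldCacheImmediately: true,   // decode now, off the render path
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxDimension * scale
    ] as CFDictionary
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```

The view would then display the bounded `UIImage` (e.g. via `Image(uiImage:)`) and apply the same `.scaledToFill()` / `.blur` / `.clipShape` chain on top, at a fraction of the memory and compositing cost.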
Mlem is looking like the true successor to Apollo, amazing work team!
Studio Ghibli hits different. RiP.