
I recently took to rewriting what should be a very simple app from Obj-C to Swift with SwiftUI - because it's the future. The CPU usage was at 5% while idle, just for having a simple tiny pie chart that updates. Not to mention that for some seemingly basic things I still had to use AppKit anyway.

Wrote basically the exact same thing 1 day later in Swift with AppKit and NO SwiftUI and it sits at 0% CPU usage with less code complexity. Maybe in a few years I will give SwiftUI another try.
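(A hypothetical sketch of how that kind of idle CPU usage often arises, not the commenter's actual code: a timer publishes into an ObservableObject observed at the root of the window, so every tick re-evaluates the whole body, chart and all. ChartModel and PieChartView are made-up names for illustration.)

    import SwiftUI

    // Hypothetical model: a timer pushes new values into a @Published
    // property once per second.
    final class ChartModel: ObservableObject {
        @Published var slices: [Double] = [0.4, 0.35, 0.25]
        private var timer: Timer?

        func start() {
            timer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { [weak self] _ in
                // Every assignment here invalidates every view observing the model.
                self?.slices = (0..<3).map { _ in Double.random(in: 0.1...0.6) }
            }
        }
    }

    // Minimal pie chart drawn with Canvas (macOS 12+).
    struct PieChartView: View {
        let slices: [Double]

        var body: some View {
            Canvas { context, size in
                let total = slices.reduce(0, +)
                let center = CGPoint(x: size.width / 2, y: size.height / 2)
                let radius = min(size.width, size.height) / 2
                var startDegrees = -90.0
                for (index, value) in slices.enumerated() {
                    let endDegrees = startDegrees + 360 * value / total
                    var slice = Path()
                    slice.move(to: center)
                    slice.addArc(center: center, radius: radius,
                                 startAngle: .degrees(startDegrees),
                                 endAngle: .degrees(endDegrees),
                                 clockwise: false)
                    slice.closeSubpath()
                    context.fill(slice, with: .color(Color(
                        hue: Double(index) / Double(slices.count),
                        saturation: 0.6, brightness: 0.9)))
                    startDegrees = endDegrees
                }
            }
            .frame(width: 120, height: 120)
        }
    }

    // Root view observing the model directly: the whole body is diffed
    // on every timer tick, even the parts that never change.
    struct ContentView: View {
        @StateObject private var model = ChartModel()

        var body: some View {
            VStack {
                Text("Lots of mostly static UI here")
                PieChartView(slices: model.slices)
            }
            .onAppear { model.start() }
        }
    }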



It’s depressing how few developers would even notice that performance degradation, let alone go back in and fix it.


Did you find out what was causing it? I have developed 'heavier' (~50k lines of code) applications in SwiftUI on Mac and they mostly sit idle (0-1% CPU, depending on whether they're doing some regular background work). Heck, I just created a quick Charts-based app from an online example on Mac and it stays at 0.0%.

Is it possible it is re-rendering the view hierarchy due to some data invalidation you haven't noticed?
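(One way to check, as a hedged sketch assuming macOS 12+ / Xcode 13+: SwiftUI's debug helper Self._printChanges() logs which property triggered each body re-evaluation, and moving the fast-changing state down into a leaf view keeps the invalidation scoped to that subview. ChartModel and PieChartView are the same hypothetical types as in the sketch above.)

    import SwiftUI

    struct RootView: View {
        var body: some View {
            // Logs each re-evaluation of this body and what caused it
            // (debug builds only).
            let _ = Self._printChanges()

            VStack {
                Text("Lots of mostly static UI here")
                // The chart owns its own model, so timer ticks only
                // invalidate ChartSection, not RootView.
                ChartSection()
            }
        }
    }

    struct ChartSection: View {
        // Hypothetical model from the sketch above.
        @StateObject private var model = ChartModel()

        var body: some View {
            let _ = Self._printChanges()

            PieChartView(slices: model.slices)   // hypothetical chart view from above
                .onAppear { model.start() }
        }
    }

If RootView stops logging after the refactor while ChartSection keeps ticking, the rest of the hierarchy is no longer being re-rendered on every update.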


Was this with the debugger attached? Did you do any further profiling in Instruments on a Release scheme to determine the cause of the CPU usage? You should be able to narrow it down to the specific system calls.


Is this app open source? I would love to poke around and see the difference, both in code and in performance.


“Premature optimization is evil” is dogma, but you can’t keyhole-optimize the architecture after the fact.

Just like with cars: you can’t build a Kia Soul and then just replace a few parts to reach Ferrari-like performance.



