
If only there were a world where Apple would sell M1 chips separately from their walled garden.


Well, number one, why would they? Apple makes money by acquiring consumers and locking them into their unicorns-and-rainbows ecosystem where everything is perfect, which makes consumers comfortable spending boatloads of money, not by selling commodity hardware.

Ecosystems with great UX and paid subscriptions plus a 30% cut on all transactions are far more profitable than the margins you make selling commodity hardware. Just ask famous phone manufacturers like Siemens, Nokia and BlackBerry why that is. That's why SW dev salaries are much higher than HW dev salaries: the former generate way more revenue than the latter. That's why Apple doesn't roll out their own cloud datacenters and instead just gets Amazon, Microsoft and Google to compete against each other on pricing.

Apple only rolls out their solutions when they have an impact on the final UX, like designing their own M1 silicon.

And number two, selling chips comes with a lot of hassle, like providing support to your partners the way Intel and AMD do. Pretty sure they don't want to bother with that.

Before they start selling chips I would rather they open iMessage to other platforms to eliminate the bubble color discrimination.


> Before they start selling chips I would rather they open iMessage to other platforms to eliminate the bubble color discrimination.

Outside of the countries where iOS is on par with Android in terms of popularity (I think the US, Canada and the UK are the only ones, maybe also Australia), I don't know of, and have never seen, a single person using iMessage. Of course there are a lot of people using iPhones outside of the mentioned countries, but absolutely nobody uses iMessage.

The whole bubble-color discrimination thing seems to only happen in those countries where iOS is as popular as or more popular than Android and people are actually using iMessage.


It's worse than that in the US. While iOS is a bit over 50%, it's closing in on 90% for teens[0], where such discrimination is most likely to occur. These numbers also bode well for Apple's future market share as these teens grow into adults.

0: https://finance.yahoo.com/news/apple-i-phone-ownership-among...


It's getting similar in Europe for teens. I rarely see them on public transport with anything other than an iPhone.


Go to southern European countries or eastern Europe; plenty of teens there have Android.


In Russia, people stopped sending each other SMS before smartphones even became mainstream. At the time they were becoming mainstream, ICQ was the instant messaging service to use, and of course there was an unofficial ICQ client for just about anything that had a screen, a keyboard, and a network interface. Also VKontakte, but that was easily accessible via Opera Mini.

Right now 99.9% of those Russians who use the internet can be reached via either VKontakte or Telegram. WhatsApp is also popular, but thankfully not around me so I was able to delete my account and never look back.


> but absolutely nobody uses iMessage

Uhm, iMessage works transparently. I just use the Messages app; if my recipient uses an iPhone they get an iMessage, and if they use something else, they get an SMS.


Their point is that most people don’t use the Messages app to communicate with others. In the UK for example WhatsApp is massively dominant.


Ditto. I'm the leper who prefers not to use WhatsApp and only gets away with it since my partner takes up the slack, so to speak. Last weekend she and I were bemused by the inability of our 100% Apple hosts to: 1. use AirPrint (it worked from my Linux phone!!!), 2. share a file with Linux or Android (an mp3 for a ringtone), 3. install a ringtone. Breathtaking. I still have my classic. And yes, I have more proprietary Apple software on diskettes than most geeks I know. But Apple tanked long ago. And hardware is cheap. And FOSS is fun.


I see people using iMessage in a country where iPhone has a whopping 7% market share. And yes,


> Before they start selling chips I would rather they open iMessage to other platforms to eliminate the bubble color discrimination

It’s probably easier to just move to one of the 99% of countries where nobody uses iMessage.


I already do, in Europe, where everyone and their mom uses Facebook's WhatsApp for everything. While that levels the playing field, I'm not sure I'd call trading a walled garden for a spyware one a massive victory though.


So who cares that there's a network nobody uses where only people with an Apple device can log in?


Apparently teens and even some adults in the US, who'll miss out on social activities or be mocked or ignored due to not being on iMessage.

That doesn't affect me though, as I don't live in the US and am too old for that kind of stuff, but I do remember how easy it was to be mocked or bullied as a teen for not having the same stuff as the herd, even before smartphones were a thing.


How do you know if this is really a thing and not just some dramatic story of the week in the media?


It's big in the startup world too; lots of funding happened on the iOS-exclusive "Clubhouse". Black people use Android more, so it is partially back to the old racially exclusive country club system.


However, that has nothing to do with iMessage being exclusive to Apple devices.


I agree with you right up until the end: how exactly does the M1 chip affect the final UX? A different keyboard, screen, touchpad, etc. all make a difference, but why does the chip make a difference?


>how exactly does the M1 chip affect the final UX?

Everything runs faster, cooler, quieter and battery lasts longer. Is that not part of the product UX?


That makes it sound like Intel, AMD, ARM, etc. were trying to build chips that run hotter and less efficiently.


Seems like Intel really lost the plot there, with every new generation having just a few percent better performance, trouble moving to smaller nodes, and the enormous regression from Spectre/Meltdown.

The Apple chips are made for running macOS/iOS. Seems there are some hardware instructions that are tailor-made for increasing the performance of Apple software, so they can make sure everything is working toward a common goal.


Not really

They are trying to compete, and have different levers to pull with varying success. When the performance per clock or per watt levers don't work well enough, then they increase the power, and the end result is heat and inefficiency.

On the flip side, integrated solutions add another lever... designing hardware that does exactly what your software needs to improve the user experience.

AMD, ARM, and even Intel have some cool, efficient solutions, but not across their whole portfolio of products, and not at the higher ends of performance. But they are always competing, incrementing and working to get closer to that ideal.

Apple was able to focus on their exact market segment and get there rapidly.


The end users don't care what brand of chip is under the hood, or why the UX of Apple's Intel-based machines sucked; they just know the new device has much better UX overall due to the more powerful and more efficient chip, and will upgrade for that.


> I agree with you right up to how exactly does the M1 chip affect the final UX?

It allows Apple to focus on what they want without being limited by, and tied to, their hardware provider's strategy.


Power efficiency for one.


Was there nobody else who made power efficient chips?


Not in the x86 arena. Every time Apple gets involved with a CPU developer (Motorola, IBM, Intel), their needs split from the developer's desires. This time they decided to go it on their own (well, after years of doing this for the iPhone). Note: they have been involved in the ARM CPU market since the days of the Newton.


Many other manufacturers had made power-efficient ARM chips; however, mainstream computer makers (which until just a few years ago included Apple) chose x86 compatibility over power efficiency.


None of the other CPU manufacturers have access to the same silicon Apple does, so it's hard to say.


That would make sense in the context of Intel but anyone with money has access to TSMC 5nm (in 2021-2022). Do I misunderstand?


Just because you have money doesn't mean you have a market. Just running a plate to create test CPUs costs in the millions. All the others were happy with the incremental upgrades they were getting from ARM. Apple needed more and started creating CPUs for the iPhone a few years back.


Looks like I did misunderstand; I thought they actually meant the silicon technology itself, which is now available to the others, and they all have designs coming that use it.


> Before they start selling chips I would rather they open iMessage to other platforms to eliminate the bubble color discrimination.

When so many telcos charge outrageous prices for SMSs, it's a useful feature.


Or, alternatively, one where some Windows / Linux manufacturer could match Apple for all the innovations in the M1 MacBooks. I'm not an Apple fan, but I'm envious of what they've accomplished and wish I could run Windows and Linux on similar hardware.

Other folks are starting to get there but only from the mobile device direction, e.g. Tensor. Maybe I should look closer at what Microsoft has done with ARM Surface.


It doesn't help that Apple bought the entire manufacturing capacity for 5nm silicon from TSCM right before the chip shortage hit. I think the next few years are going to get very competitive though, and I'm excited to see how Intel and AMD respond.


Apple has done that before. IIRC when the original iPod came out it used a new generation of HDD. Apple went to the drive manufacturer and said "we'll take all of them" and they agreed.


How is Amazon able to produce their ARM chips for AWS? Assuming those are not 5nm?


There's still 5nm silicon for sale, but just not at TSCM (the largest semiconductor manufacturer in the world). Companies like Samsung are just now getting around to mass-producing 5nm, and afaik there were a few domestic Chinese manufacturers who claimed to be on the node too.

As for Amazon specifically though, I've got no idea. They're a large enough company that they could buy out an entire fab or foundry if they wanted, AWS makes more than enough money to cover the costs.


Nitpick: it's TSMC, Taiwan Semiconductor Manufacturing Company.


Good catch; my mind always interprets it as Taiwan Semi-Conductor Manufacturer.


Yeah, I always need to think twice before writing or saying their name. Same with ASML. I guess there is a reason why TLAs are much more common than FLAs.


> all the innovations in the M1 Macbooks

What are the innovations in them? From everything I've heard, they just basically reverted all the changes most people hated for the last few years and slapped a new chip in there.


The "walled garden" comes with a C and C++ toolchain, python, perl, awk, sed, and a Unix shell. It is not, in any way, a "walled garden" in a universe where words have shared meaning.


Exactly; I cannot believe the Hacker News crowd are penalising you for correcting OP, who doesn't know that the walled garden metaphor specifically refers to the App Store, which is not an issue on macOS.


No. That might have been where you first saw the concept applied, but a walled garden is a commercial ecosystem that is deliberately closed to foster a sense of value and exclusivity, usually with no technical reason for it.

Walled gardens are inherently anti-consumer market plays that make things worse for everyone except the people milking money from the idiots paying into the walled garden.


What part of macOS is a walled garden? I can use any Bluetooth or USB device with it. I can install Linux on it. I can compile my own code on it. I can download applications from any source I please and install them.


I agree, but would distinguish the Mac from the phone. Bluetooth file transfers are a walled-garden problem with iOS. I assume you mean macOS?


It's a walled garden when you're not allowed to leave or bring your friends in, no matter how nice the stuff on the inside is.


And that analogy applies to macOS and the M1 CPU how, exactly?


What does it even mean?


The Pi has much better performance per dollar, which is a metric that's important to some people too.


Purchase dollar? Or energy dollar?


I don't think there's anything else on the planet that rivals the performance per watt of the M1 family.

Also, the RPi's SoC is made in an older 28nm process (that's one of the reasons why it's cheaper).


They won't. Their margins on the services side are obscene so getting people into that ecosystem is worth much more than the sales of some processors.


I'm hoping Alyssa Rosenzweig's fantastic work documenting the M1 GPU will let us write native Vulkan drivers even for macOS. I believe she's been focusing thus far on the user-space-visible interfaces, so a lot of that work should translate well.


I'm sure there's a not-so-distant future where Broadcom ships a 7nm (or smaller) SoC that finds its way into the Raspberry Pi series.

It's not like Apple has a meaningful moat around state of the art silicon. And that's a Good Thing.


Also accepted would be a world where they just add Vulkan support to their APUs already.


A fully compliant Vulkan implementation for M1 would come with very surprising performance cliffs for a developer.

One of them: https://github.com/KhronosGroup/MoltenVK/issues/1244


And also potential optimisations that are not possible on other GPUs:

https://developer.apple.com/documentation/metal/gpu_features...
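
For example, one Apple-GPU feature from that page is "programmable blending": a fragment shader can read the render target's current value straight out of tile memory, so custom blend math needs no extra pass or texture round-trip. A minimal Metal Shading Language sketch (illustrative only, names made up, not from any shipping code):

    #include <metal_stdlib>
    using namespace metal;

    struct FragmentIn {
        float4 position [[position]];
        half4  srcColor;        // interpolated from the vertex stage
    };

    // On Apple-family GPUs, the [[color(0)]] attribute hands the shader
    // the current contents of render target 0 directly from tile memory.
    fragment half4 customBlend(FragmentIn in       [[stage_in]],
                               half4      dstColor [[color(0)]])
    {
        // An arbitrary blend that fixed-function blend state can't
        // express: multiply by the destination, then add the source back.
        return in.srcColor * dstColor + in.srcColor;
    }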


That's pretty common for TBDRs. The tile is rendered into a fixed-size on-chip buffer, and the driver has to split the tile into multiple passes to fit all of the render target data when nutty amounts of data come out of the shader. PowerVR works the same way (completely unsurprisingly).



It'd be surprising if an architecture had 0 such surprises and did everything Vulkan allows without any special performance considerations vs another architecture.



It's fine, but it's frankly silly that you're forced to translate a free and open graphics API into a more proprietary one. Compare that to something like DXVK, which exists because Linux users cannot license DirectX on their systems. MoltenVK exists simply because Apple thought "let's not adopt the industry-wide standard for CG graphics on our newer machines". Again, not bad, but a bit of a sticky situation that is entirely predicated on technology politics, not on what's actually possible on these GPUs.


Metal was released a year before Vulkan. Apple just didn't want to wait and decided to design their own, better-than-OpenGL API.


Mantle was released ~1 year before Metal.


Proprietary to AMD, which gave it to Khronos when it was obvious they would do a second version of Longs Peak if left on their own.

Yet Vulkan shows they cannot fix their love for extension spaghetti.


DirectX was released a decade before Vulkan, that didn't stop manufacturers from including support for both so the user could decide for themselves.


You mean the support that is only possible because Windows backward compatibility still supports the OpenGL 1.1 ICD that they rely on?

Most of the time with crappy drivers that are a shadow of their DirectX ones?


> still supports the OpenGL 1.1 ICD that they rely on?

On Windows 11, it’s OpenGL 3.3 on top of DX12, because Qualcomm doesn’t provide an OpenGL ICD at all.

> crappy drivers

Special mention to the Intel OpenGL graphics driver on Windows. If you thought that the AMD Windows one was bad, the Intel one was somehow significantly worse.


With this GPU performance, you'd think Apple might like to take advantage of it for gaming.


They already do; all middleware engines that actually matter already support Metal.

Additionally, iOS and Apple have much better tooling for Metal than plain DirectXTK/PIX, or that toy SDK from Khronos (which Google also uses on Android), if we compare vendor tooling.


Sounds like you don't need any help then, enjoy your 50-75% performance hit playing (a scant few) games through DirectX -> Vulkan -> Wine/Crossover ($30) -> MoltenVK -> Metal!


The games I care about enjoy native Metal and DirectX, and when I code anything graphics-related I don't use Khronos stuff, except on the Web, where there is no other option.

Any of the game dev Gems books have basic examples of doing an API loading layer (see the sketch below).

Vulkan is mostly a Linux thing, and even the Switch has its own native API, NVN; it is not Vulkan or OpenGL in the driving seat.

Here, enjoy: https://www.ogre3d.org/
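
For what it's worth, the pattern those books show is tiny: the engine codes against a small device interface and a loader picks the backend at startup. A hedged C++ sketch (every name here is made up, not from Gems or OGRE):

    #include <cstdio>
    #include <memory>

    // The only thing engine code ever sees.
    class RenderDevice {
    public:
        virtual ~RenderDevice() = default;
        virtual void beginFrame() = 0;
        virtual void draw() = 0;
        virtual void endFrame() = 0;
    };

    // Stub backends; real ones would wrap MTLCommandBuffer / VkCommandBuffer.
    class MetalDevice final : public RenderDevice {
        void beginFrame() override { std::puts("Metal: begin frame"); }
        void draw()       override { std::puts("Metal: draw"); }
        void endFrame()   override { std::puts("Metal: end frame"); }
    };

    class VulkanDevice final : public RenderDevice {
        void beginFrame() override { std::puts("Vulkan: begin frame"); }
        void draw()       override { std::puts("Vulkan: draw"); }
        void endFrame()   override { std::puts("Vulkan: end frame"); }
    };

    // The "loading layer": pick a backend once, at startup.
    std::unique_ptr<RenderDevice> loadDevice() {
    #ifdef __APPLE__
        return std::make_unique<MetalDevice>();
    #else
        return std::make_unique<VulkanDevice>();
    #endif
    }

    int main() {
        auto device = loadDevice();
        device->beginFrame();
        device->draw();
        device->endFrame();
    }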


That’s not shown in terms of games people actually want to buy.


> It's fine, but it's frankly silly that you're forced to translate a free and open graphics API into a more proprietary one.

That's exactly what every graphics API has done, because the underlying chip architecture is never free and open (and often is neither).

If the issue is that you are targeting a proprietary intermediate API rather than bare metal, that is also how Nvidia's drivers work.


Is what is possible with Metal possible with GL though? Both in performance and features? They didn’t build Metal just to be contrarian.


Why? Apple has always stated they don't want to be in an enterprise-like market. It stifles innovation. While you can keep adding features to your product, you can never take away from it. Ex: x86 and Windows. Meanwhile, Apple has removed entire CPU functionality from their chips since the release of the iPhone 4S. This was easy because they only had to deal with their own developers. This keeps them agile and able to change from one release to another.


That's a bit of an ironic comment

Broadcom chips are not available on the open market, and they won't sell to you unless you are an enormous company (or have a "special relationship", as RPi did). Effectively you can only buy one attached to a Pi.


Maybe they could make their own SBC, just an M1 and four USB-C ports… Apple Pi?


Why? They gave an "it's possible" proof. They reap the benefits of doing it first - all good. Now it's time for the competition to pick it up, possibly improve on it, or fade away Intel-style.


No worries. Competition is coming.

https://www.phoronix.com/scan.php?page=news_item&px=SiFive-P...

Should be roughly M1 performance, but on RISC-V.


uffff

Who knows when that is coming and when we are going to be able to buy regular laptops from e.g. Lenovo, HP, Acer, etc. with that.

By the time that happens, Apple may already be on their third or fourth generation of the M1, which is going to be much, much faster than the M1.


> Which is going to be much, much faster than the M1.

Will it really?

It isn't a given. They might bring amazing progress, or not.

Ultimately, it doesn't matter all that much if it isn't available to third parties. It's not as if everybody else is sitting on their ass.

SiFive's not a fat company; their research budget is tiny relative to the likes of Apple. And yet, they're coming up with competitive cores.

Things are so much easier when not restricted by a shitty ISA (x86). I have taken a look, and I really like RISC-V; I find it better than ARM.


M1 is WAY faster than a Cortex-A78.



