Well, number one, why would they? Apple makes money by locking consumers into its unicorns-and-rainbows ecosystem where everything is perfect, which makes them comfortable spending boatloads of money, not by selling commodity hardware.
Ecosystems with great UX, paid subscriptions, and a 30% cut on all transactions are far more profitable than the margins you make selling commodity hardware. Just ask famous phone manufacturers like Siemens, Nokia and BlackBerry why that is. That's why SW dev salaries are much higher than HW dev salaries: the former generates way more revenue than the latter. And that's why Apple doesn't roll out its own cloud datacenters and instead just gets Amazon, Microsoft and Google to compete against each other on pricing.
Apple only rolls out their solutions when they have an impact on the final UX, like designing their own M1 silicon.
And number two, selling chips comes with a lot of hassle, like providing support to partners the way Intel and AMD do. Pretty sure they don't want to bother with that.
Before they start selling chips I would rather they open iMessage to other platforms to eliminate the bubble color discrimination.
> Before they start selling chips I would rather they open iMessage to other platforms to eliminate the bubble color discrimination.
Outside of the countries where iOS is on par with Android in terms of popularity (I think the US, Canada and the UK are the only ones, maybe also Australia), I don't know or haven't seen a single person using iMessage. Of course there are a lot of people using iPhones outside the mentioned countries, but absolutely nobody uses iMessage.
The whole bubble-color discrimination seems to only happen in those countries where iOS is as popular as or more popular than Android and people are actually using iMessage.
It's worse than that in the US. While iOS is a bit over 50%, it's closing in on 90% for teens[0], where such discrimination is most likely to occur. These numbers also bode well for Apple's future market share as these teens grow into adults.
In Russia, people stopped sending each other SMS before smartphones even became mainstream. At the time they were becoming mainstream, ICQ was the instant messaging service to use, and of course there was an unofficial ICQ client for just about anything that had a screen, a keyboard, and a network interface. Also VKontakte, but that was easily accessible via Opera Mini.
Right now 99.9% of those Russians who use the internet can be reached via either VKontakte or Telegram. WhatsApp is also popular, but thankfully not around me so I was able to delete my account and never look back.
Ditto. I'm the leper who prefers not to use WhatsApp, and I only get away with it since my partner takes up the slack, so to speak. Last weekend she and I were bemused by the inability of our 100% Apple hosts to: 1. Use AirPrint (worked from my Linux phone!!!) 2. Share a file with Linux or Android (an mp3 for a ringtone) 3. Install a ringtone. Breathtaking. I still have my classic. And yes, I have more proprietary Apple software on diskettes than most geeks I know. But Apple tanked long ago. And hardware is cheap. And FOSS is fun.
I already do, in Europe, where everyone and their mom uses Facebook's WhatsApp for everything. While that evens the playing field, I'm not sure I'd call trading a walled garden for a spyware one a massive victory though.
Apparently teens and even some adults in the US, where they'll miss out on social activities or be mocked or ignored for not being on iMessage.
That doesn't affect me though, as I don't live in the US and am too old for that kind of stuff, but I do remember how easy it was to be mocked or bullied as a teen for not having the same stuff as the herd, even before smartphones were a thing.
It's big in the startup world too; lots of funding happened on the iOS-exclusive "Clubhouse". Black people use Android more, so it is partially back to the old racially exclusive Country Club system.
I agree with you right up until the M1: how exactly does the chip affect the final UX? A different keyboard, screen, touchpad, etc. all make a difference, but why does the chip make a difference?
Seems like Intel really lost the plot there, with every new generation offering just a few percent better performance, trouble moving to smaller nodes, and the enormous regressions from Spectre/Meltdown.
The Apple chips are made for running macOS/iOS. There seem to be hardware features tailor-made for increasing the performance of Apple software (the switchable x86-style memory ordering that Rosetta 2 relies on is one example), so they can make sure everything is working toward a common goal.
They are trying to compete, and have different levers to pull with varying success. When the performance per clock or per watt levers don't work well enough, then they increase the power, and the end result is heat and inefficiency.
On the flip side, integrated solutions add another lever... writing hardware that does exactly what your software needs to improve the user experience.
AMD, ARM, and even Intel have some cool, efficient solutions, but not across their whole portfolio of products, and not at the higher ends of performance. But they are always competing, incrementing and working to get closer to that ideal.
Apple was able to focus on their exact market segment and get there rapidly.
End users don't care what brand of chip is under the hood, or why the UX on Apple's Intel-based machines sucked; they just know the new device has a much better UX overall thanks to the more powerful, more efficient chip, and they'll upgrade for that.
Not in the x86 arena. Every time Apple gets involved with a CPU developer (Motorola, IBM, Intel), their needs split from the developer's desires. This time they decided to go it alone (well, after years of doing this for the iPhone). Note: they have been involved in the ARM CPU market since the days of the Newton.
Many other manufacturers had made power-efficient ARM chips; however, the mainstream computer makers (until just a few years ago, including Apple) chose x86 compatibility over power efficiency.
Just because you have money doesn't mean you have a market. Just running a plate to create test CPUs costs in the millions. Everyone else was happy with the incremental upgrades they were getting from ARM. Apple needed more and started creating its own CPUs for the iPhone a few years back.
Looks like I did misunderstand; I thought they actually meant the silicon process technology itself, which is now available to the others, and they all have designs coming that use it.
Or, alternatively, one where some Windows / Linux manufacturer could match Apple on all the innovations in the M1 MacBooks. I'm not an Apple fan, but I'm envious of what they've accomplished and wish I could run Windows and Linux on similar hardware.
Other folks are starting to get there but only from the mobile device direction, e.g. Tensor. Maybe I should look closer at what Microsoft has done with ARM Surface.
It doesn't help that Apple bought the entire manufacturing capacity for 5nm silicon from TSMC right before the chip shortage hit. I think the next few years are going to get very competitive though, and I'm excited to see how Intel and AMD respond.
Apple has done that before. IIRC when the original iPod came out it used a new generation of HDD. Apple went to the drive manufacturer and said "we'll take all of them" and they agreed.
There's still 5nm silicon for sale, just not at TSMC (the largest contract chip manufacturer in the world). Companies like Samsung are just now getting around to mass-producing 5nm, and afaik there were a few domestic Chinese manufacturers who claimed to be on the node too.
As for Amazon specifically though, I've got no idea. They're a large enough company that they could buy out an entire fab or foundry if they wanted; AWS makes more than enough money to cover the costs.
Yeah, I always need to think twice before writing or saying their name. Same with ASML. I guess there is a reason why TLAs are much more common than FLAs.
What are the innovations in them? From everything I've heard, they just basically reverted all the changes most people hated for the last few years and slapped a new chip in there.
The "walled garden" comes with a C and C++ toolchain, python, perl, awk, sed, and a Unix shell. It is not, in any way, a "walled garden" in a universe where words have shared meaning.
Exactly, I cannot believe the Hacker News crowd are penalising you for correcting OP on not knowing that the walled garden metaphor specifically refers to the App Store, which is not an issue on MacOS.
No. That might have been where you first saw the concept applied, but a walled garden is a commercial ecosystem that is deliberately closed to foster a sense of value and exclusivity, usually despite there being no technical reason for it.
Walled gardens are inherently anti-consumer market plays that make things worse for everyone except the people milking money from the idiots paying into the walled garden.
What part of MacOS is a walled garden? I can use any Bluetooth or USB device with it. I can install Linux on it. I can compile my own code on it. I can download applications from any source I please and install them.
I'm hoping Alyssa Rosenzweig's fantastic work documenting the M1 GPU will let us write native Vulkan drivers even for MacOS. I believe she's been focusing thus far on the user space visible interfaces, so a lot of that work should translate well.
That's pretty common for TBDRs. The tile is rendered into a fixed-size on-chip buffer, and the driver has to split the tile into multiple passes when the shader produces more render target data than that buffer can hold. PowerVR works the same way (completely unsurprisingly).
It'd be surprising if an architecture had 0 such surprises and did everything Vulkan allows without any special performance considerations vs another architecture.
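To put rough numbers on that tile splitting, here's a minimal sketch (the tile size, the 32 KB tile buffer and the render target setup below are made-up figures for illustration, not Apple's actual specs):

    #include <cstdio>

    int main() {
        // Made-up numbers for illustration only.
        const int tileWidth = 32, tileHeight = 32;  // pixels per tile
        const int tileBufferBytes = 32 * 1024;      // assumed fixed on-chip tile memory
        const int bytesPerPixel = 4 * 8 + 4;        // e.g. four RGBA16F targets + 32-bit depth

        const int bytesNeeded = tileWidth * tileHeight * bytesPerPixel;
        // If the render targets don't fit in tile memory, the driver has to
        // split the tile's work across multiple passes.
        const int passes = (bytesNeeded + tileBufferBytes - 1) / tileBufferBytes;
        std::printf("%d bytes per tile -> %d pass(es)\n", bytesNeeded, passes);
    }

With those numbers you need 36,864 bytes per tile against a 32 KB buffer, so the driver ends up doing two passes. Real drivers split at render-target granularity rather than by raw bytes, so the actual pass count can differ, but the "doesn't fit, so split" logic is the same.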
It's fine, but it's frankly silly that you're forced to translate a free and open graphics API into a more proprietary one. Compare that to something like DXVK, which exists because Linux users cannot license DirectX on their systems. MoltenVK exists simply because Apple thought "let's not adopt the industry-wide standard for 3D graphics on our newer machines". Again, not bad, but a bit of a sticky situation that is entirely driven by technology politics, not by what's actually possible on these GPUs.
> still supports the OpenGL 1.1 ICD that they rely on?
On Windows 11, it’s OpenGL 3.3 on top of DX12, because Qualcomm doesn’t provide an OpenGL ICD at all.
> crappy drivers
Special mention to the Intel OpenGL graphics driver on Windows. If you thought that the AMD Windows one was bad, the Intel one was somehow significantly worse.
They already do; all the middleware engines that actually matter already support Metal.
Additionally, if we compare vendor tooling, Apple has much better tooling for Metal on iOS and macOS than plain DirectXTK/PIX, or that toy SDK from Khronos (which Google also uses on Android).
Sounds like you don't need any help then; enjoy your 50-75% performance hit playing (a scant few) games through Wine/Crossover ($30) translating DirectX to Vulkan and MoltenVK translating Vulkan to Metal!
The games I care about have native Metal and DirectX versions, and when I code anything graphics-related I don't use Khronos stuff, except on the Web, where there is no other option.
Any of the game dev "Gems" books has basic examples of writing an API loading layer (a rough sketch follows below).
Vulkan is mostly a Linux thing; even the Switch has its own native API, NVN, so it's not Vulkan or OpenGL in the driving seat there either.
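For anyone who hasn't seen one, here's a minimal sketch of the backend-selection half of such a loading layer (the Renderer interface and backend classes are hypothetical, not from any particular book or engine; a real loader would also resolve the API's entry points at runtime, e.g. via dlopen/vkGetInstanceProcAddr):

    #include <cstdio>
    #include <memory>

    // The rest of the engine only ever talks to this interface.
    struct Renderer {
        virtual ~Renderer() = default;
        virtual void drawFrame() = 0;
    };

    struct MetalRenderer : Renderer {
        void drawFrame() override { std::puts("drawing with Metal"); }
    };

    struct VulkanRenderer : Renderer {
        void drawFrame() override { std::puts("drawing with Vulkan"); }
    };

    // Pick a backend at startup based on the platform (or a config option).
    std::unique_ptr<Renderer> makeRenderer() {
    #if defined(__APPLE__)
        return std::make_unique<MetalRenderer>();
    #else
        return std::make_unique<VulkanRenderer>();
    #endif
    }

    int main() {
        auto renderer = makeRenderer();
        renderer->drawFrame();
    }

Ship the Metal path on Apple platforms and the Vulkan (or DirectX) path elsewhere, and nothing above this layer has to care which API is underneath.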
Why? Apple has always stated they don't want to be in an enterprise-like market. It stifles innovation: while you can keep adding features to your product, you can never take away from it. Ex: x86 and Windows. Meanwhile, Apple has removed entire CPU functionality from their chips since the release of the iPhone 4S. This was easy because they only had to deal with their own developers. This keeps them agile and able to change from one release to another.
Broadcom chips are not available on the open market, and they won't sell to you unless you are an enormous company (or have a "special relationship" as RPi did). Effectively you can only buy one attached to a Pi.
Why? They gave an "it's possible" proof. They reap the benefits of doing it first, all good. Now it's time for the competition to pick it up, possibly improve on it, or fade away Intel-style.