The Wingnut AR demo that Apple showed during WWDC featuring the Unreal Engine was just mindblowing! Thankfully, it's available on Unreal's YouTube channel, so you can still watch it even if you don't have Safari or the app - https://www.youtube.com/watch?v=S14AVwaBF-Y
This is nice but what's unique about this? Is it significantly better or more performant? Is it just nice to have an integrated solution?
I'd love to see a breakdown of ARKit vs something like Vuforia. Here's a video with the equivalent functionality to the WWDC demo -- https://www.youtube.com/watch?v=JvE_7filGsY
>This is nice but what's unique about this? Is it significantly better or more performant? Is it just nice to have an integrated solution?
Ignoring all that other stuff, it's officially sanctioned and supported, part of the platform, requires no external libs, has Cocoa level documentation, and tons of developers will be using it very soon.
It all depends on whether Apple makes ARKit a priority, though. Apple has advertised SpriteKit on various occasions, then mostly broke it in iOS 9.0/9.1 [1], [2]. Game Center went down once a single game (Letterpress) started using it as designed [3]. What will you do if ARKit breaks in iOS 12.0? If you use a third-party lib, at least you can roll back to an earlier version, or even fix the bug yourself if it is open source.
I would stay as far away from Apple as possible when it comes to gamedev tools.
> part of the platform, requires no external libs [...]
Sadly this also means Apple will only update it in subsequent versions of the OS. This is a big deal when choosing a target platform, e.g. the ARKit that ships with iOS 11 vs. the one in iOS 12.
For example, when iOS 12 comes out your target audience will be split, some on iOS 11 and some on iOS 12. You'll have to decide if you want the new features in 12 or the ability to run on devices still using iOS 11.
Compare with other game platforms, which insulate you from underlying tech changes. For example a Unity app can be updated and use the latest Unity features, even on an old iOS.
[FWIW Apple could update ARKit independently of iOS, but that's not been their pattern]
The building destruction physics were pretty impressive. Maybe I'm out of touch with modern game engines, but I've not seen anything quite like that before.
This kind of thing is the least interesting application of AR in my opinion. I want high-quality information overlays about the real world. I do understand that some people just want Starcraft on a coffee table, but it seems like a low ambition for Apple to showcase.
>This kind of thing is the least interesting application of AR in my opinion. I want high-quality information overlays about the real world. I do understand that some people just want Starcraft on a coffee table, but it seems like a low ambition for Apple to showcase.
It makes for a compelling proof of concept. Having a lot of individual things interacting with the environment independently is most of what you need, technically, for the high-quality overlays you're talking about. If you can do the former, you can do the latter; this is just a sexier way of showing off the capabilities.
I don't get the point of seeing anything in the background of the table, in this case the audience. Why not paint it with sky, so it's less distracting?
Oh, I get it, then it's not AR. Well, maybe AR is not such a good fit in this case to begin with.
Then it's just watching a 360 YouTube video on your phone.
This is still new and doesn't make full use of the medium yet, but there are many reasons why you'd want to see the environment. For one, it gives a sense of scale. For example, if you put down a life-size dinosaur, seeing the surroundings allows you to understand the scale better. Next up, the AR application could very well interact with the world too, which brings new possibilities.
This is pretty new technology and no one knows the best way to use it yet. Maybe in the future we'll start putting a sky in the background. Maybe we'll find better ways to use AR. Both AR and VR are still in their infancy.
Good point, except that in this example, you don't understand the scale better (since all the people and spaceships fit on a table). I believe (for this example) an immersive experience with VR goggles would be much better.
I definitely agree. The examples Google gave of Tango were much more in line with the points I was making. But again, I'd love to see what new creative things others come up with.
It demonstrates the power they have with AR. If they can show a town appear on a table with air strikes coming in in real time, it implies that simpler, more practical AR use cases will be much easier.
Prior to this, as a developer, you might think AR is limited to what Pokemon GO and the first example (like a cup of coffee on a table) show. After seeing this, if you were thinking of doing anything simpler than what they were showing, you'd likely see it as much more feasible than you previously did and perhaps look into including some of those features.
And this was at WWDC (their developer conference), so it certainly applies to a developer audience.
The content itself wasn't that big of a deal; the fact that the AR technology worked so well, and is available to anyone using the SDK, is what was impressive and hype-worthy.
Doing AR from scratch is a huge endeavour but this should make it simple enough so that we start seeing more applications using it for more useful stuff.
I thought the Samsung S8, being a premium phone, should have had AR.
Compare that to 60-120 fps on iOS 11: Apple's Metal 2 could give a boost by reducing latency from 16.6 ms to around 8.3 ms, and even more with direct display access. That matches the PlayStation VR refresh rate. Real-world tracking is what we need.
https://www.youtube.com/watch?v=Yphh1Ue3D6g
Also, Apple has almost never been the first to do anything. Tim Cook even admitted this yesterday. What Apple brings to the table is an actual polished solution, which, having used HoloLens extensively, I can say HoloLens absolutely is not.
Well, Apple has just released the 'smart keyboard' for its iPads, the 'smart' bit being that with a physical connection, "it doesn't need charging and automatically transfers data". Such innovation! Something keyboards have been lacking for decades...
I know, I have been using thin and light portable keyboards that magnetically snap to my PC and align the power/data connection since the early '80s.
So you're saying the 'smart' bit is the magnets? Because I don't know what keyboards you were using, but the power and data connections were always aligned when I connected up my keyboards.
Unfortunately, the old-fashioned keyboards also worked when they weren't hard up against the screen - thank god we have these new 'smart' keyboards that have to lie flush against the screen to work.
How was it mindblowing? I thought it was a pretty poor example of what AR can do. It was just a 3d game on a tabletop. Pokemon Go is a lot better, and it's pretty basic.
Having designed an AR experience for children and done quite a bit of research into this problem a few years ago... You incorporate thumb controls on the sides of the device where you would naturally hold it. You use physical movement of the tablet to move things in the environment. It's limiting, but it works and can allow for fun interaction with the environment.
I wasn't aware of that, what format does Google use for its live streams?
I know that YouTube on Chrome uses VP9, which is not a web standard - or at least not more of a web standard than HTTP Live Streaming (which is what Apple uses for its keynote).
You're right, I am legitimately confused. If there are no web standard video encodings, is it therefore wrong to say that "Google uses web standards for their video?" What exactly is Google doing better?
YouTube uses MPEG-DASH as its adaptive streaming protocol, which works over standard web protocols with royalty free codecs and containers, unlike HLS.
As for why there isn't a web-standard video encoding and container: all the major browser vendors except one have announced support for WebM and for VP9 and its successor.
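For a sense of what "works over standard web protocols" means in practice, here's a minimal sketch of browser-side DASH playback using the open-source dash.js player; the manifest URL is a hypothetical placeholder:

    // Minimal sketch of DASH playback in the browser via the open-source
    // dash.js player. The manifest URL below is a hypothetical placeholder.
    import * as dashjs from "dashjs";

    const video = document.querySelector<HTMLVideoElement>("#player")!;
    const player = dashjs.MediaPlayer().create();

    // dash.js fetches the MPD manifest and media segments over plain
    // HTTP(S) and appends them to the <video> element through the
    // Media Source Extensions API -- no plugin or native support needed.
    player.initialize(video, "https://example.com/stream/manifest.mpd", true);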
I get that you don't want to use Safari, but actually "forcing us to use Safari" is exactly what Google did. Google chose not to implement HTTP Live Streaming, and therefore it's not available in Chrome. Microsoft made the other choice, which is why it works in Edge.
The double standard here is really quite breathtaking. Apple doesn't implement a feature: Apple's fault. Google doesn't implement a feature: Apple's fault too.
HLS requires the MPEG2-TS container format, which is not royalty free and therefore has no chance of becoming a web standard or of being implemented by Mozilla on free platforms.
I don't see the double standard. All but one browser on GP's platform support the royalty free option. Only Safari on his platform supports the encumbered option, and he doesn't want to be forced to use it. On my own preferred platform, no browsers support HLS.
Since the votes on this thread won't let me reply:
@pjmlp: I choose to use a platform where developer experience is the primary goal because developing software is both my vocation and avocation and why I am reading "Hacker News." If I wanted a walled garden media consumption toy, I would get a LeapPad. Look, we can snark all day, but that doesn't change the conclusion that tommoor was right about web standards.
@millstone: There are no royalty-encumbered formats among web standards for a reason: to keep the web free and open to all. That is why it is not a red herring. Bringing Google into it, on the other hand, is a red herring: whether Google pays the fees has no bearing on whether the format should be a web standard. Apple chose not to implement hardware decoding for the unencumbered format. All other major consumer hardware manufacturers have.
Not that I like HLS even a tiny bit, but even if MPEG2-TS were patent-encumbered (and I'm not sure it is), HLS now supports fragmented MP4 just like DASH (this was done so that content providers don't need to store two different muxings of the video for DASH and HLS).
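To illustrate (a hypothetical minimal playlist, not one of Apple's): an fMP4 HLS media playlist looks just like a TS one, except that an EXT-X-MAP tag points at the MP4 initialization segment and the media segments are MP4 fragments:

    #EXTM3U
    #EXT-X-VERSION:7
    #EXT-X-TARGETDURATION:6
    #EXT-X-MAP:URI="init.mp4"
    #EXTINF:6.0,
    segment0.m4s
    #EXTINF:6.0,
    segment1.m4s
    #EXT-X-ENDLIST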
Apple, as usual, wrote their own private format. A pretty sucky one at that. But don't get fooled: DASH is not much better, and for the few things it offers over HLS it comes with a massive implementation difficulty. There is basically no fully DASH-compliant player around; not even the one developed by the DASH Industry Forum implements the whole standard. It would be way easier for every browser vendor to support HLS than DASH.
"Royalty free" seems like a red herring. I'm not familiar with MPEG2-TS in particular, but Chrome already happily plays H.264 on macOS, plus there is an explicit exception in H.264 license for non-paid (including ad-supported) content. I don't see how implementing HLS would increase Google's licensing cost in any way.
I appreciate the free software position. It seems MPEG-DASH is indeed better suited for it, though maybe only slightly [1]. (Regardless, turns out Apple did not release anything as copyleft this year, so maybe it's better that free software purists could not have watched the video?)
The case for "encumbered" options is simple: it's what's decoded in hardware, for users who prefer their device to last the entire video.
If Apple used fMP4 for this video, that gives them no excuse not to support other browsers. They can just ship a JavaScript HLS client built on the MSE web standard on the web page.
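As a rough sketch of what that client could look like (the open-source hls.js library does exactly this; the stream URL below is a hypothetical placeholder):

    // Sketch of an HLS client built on Media Source Extensions, using
    // the open-source hls.js library. The stream URL is hypothetical.
    import Hls from "hls.js";

    const video = document.querySelector<HTMLVideoElement>("#player")!;
    const src = "https://example.com/wwdc/stream.m3u8";

    if (Hls.isSupported()) {
      // Browsers without native HLS support (Chrome, Firefox, Edge) get
      // the playlist parsed in JS and segments appended through MSE.
      const hls = new Hls();
      hls.loadSource(src);
      hls.attachMedia(video);
    } else if (video.canPlayType("application/vnd.apple.mpegurl")) {
      // Safari plays HLS natively, so just point the element at it.
      video.src = src;
    }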
It is an Apple developer conference for software developers that care about making beautiful applications that take full advantage of native experiences on Apple platforms.
Any developer in this community can watch the video.
I can still be an outsider and be genuinely interested in what their platform has to offer and, maybe, later decide that I want to invest in their platform. I become a better developer, Apple gains a new developer, everyone wins.
Forcing people to use macOS to view recorded sessions or events goes against that.
However, the default experience is a little box that says "Streaming is available in Safari and the WWDC app". Now I have to go out of my way to learn that they stream via HLS, obtain the stream's URL and feed it to VLC. †
Compare this with Microsoft's and Google's videos (available through Channel 9 and YouTube, respectively), which are accessible from virtually every operating system and device.
If your goal is to attract new developers to the platform, adopting a more widespread industry standard (such as DASH, which works in every other browser) is probably the way to go, IMO.
--
† By the way, the box also breaks the "Copy Video Location" menu item, so I have to open the inspector or install an extension to find out what the real URL is.
It does seem like a lost opportunity to do some outreach though. It's not hard to provide streams both in HLS and DASH, and streaming the developer conference to everyone could surely attract new developers to their platform (and not only developers, the same thing applies to presentations of new devices and consumer software).
To me it really looks like a statement against people without Macs; Apple doesn't care about them, not even in its own interest of making money. They won't speak to you as long as you don't have an iDevice, full stop.
All computer ecosystems up to the late 90's were like that; the only thing special about Apple is that they are the surviving one from those days.
It was IBM's "mistakes" that made the PC different from all other computer ecosystems. However, the current trends of commodity hardware and race-to-the-bottom prices are pushing PC OEMs back to the 80's-90's culture of full hardware/software integration.