> 34:02 binary logs are not a bad thing as long
> 34:06 as you have the tools to pull them apart
Really? Did he just wave away, with an "as long as", the biggest downside of binary logs compared to text logs, namely that far fewer tools can work on a binary log because of its special encoding?
With the same logic, flu viruses are not a bad thing as long as you are immune to them.
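To make the "tools to pull them apart" point concrete, here is a minimal sketch (the record format here is hypothetical, not any real log format) of why a binary log needs a matching decoder while a text log yields to generic tools:

```python
import io
import struct

# Hypothetical binary record: u32 unix timestamp, u8 severity level,
# u16 message length, followed by the UTF-8 message bytes.
RECORD_HEADER = struct.Struct("<IBH")

def write_record(buf, ts, level, msg):
    data = msg.encode("utf-8")
    buf.write(RECORD_HEADER.pack(ts, level, len(data)))
    buf.write(data)

def read_records(buf):
    while True:
        header = buf.read(RECORD_HEADER.size)
        if not header:
            return
        ts, level, length = RECORD_HEADER.unpack(header)
        yield ts, level, buf.read(length).decode("utf-8")

log = io.BytesIO()
write_record(log, 1547000000, 3, "disk almost full")
log.seek(0)
for ts, level, msg in read_records(log):
    print(ts, level, msg)
```

Without knowing the `RECORD_HEADER` layout, grep and less see only opaque bytes; a text log line carries its own "schema", which is exactly the advantage the commenter is defending.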
He is maybe the best foreign (non-Korean) Protoss player. The match that AlphaStar lost had its actions restricted to a single screen; before that, AlphaStar could see and control everything outside the fog of war simultaneously. Still quite impressive, though.
I agree with you and would add that he is probably not the best non-Korean Protoss either. The blog post presenting AlphaStar only claims he is one of the world's strongest players and links to this post: https://liquipedia.net/starcraft2/2018_StarCraft_II_World_Ch... which lists Mana as the 5th-best non-Korean Protoss player in the World Championship Series.
AlphaStar has drawn criticism for having unfair advantages. It went 5-0 against Mana when it could see and control the whole map at once, for example. But after a camera restriction was imposed (so it sees the map the way humans do), it lost 0-1.
With all that said, it is still impressive: the best SC2 bot we have, by far!
Less than four years ago our state-of-the-art RL system (DQN) could only beat some Atari games. Now we can almost beat the best human players in SC2. That, to me, is very impressive.
Truth is not measured by the production of cognitive dissonance. "Cognitive dissonance" is not even a validated phenomenon; it's barely more than psychobabble.
If you crack those PC models open, what's really different? The hardware interfaces of the PC have been essentially stable since the age of IBM PC clones. It may not be a monoculture in terms of all the RGB lighting you can put on your machine, but you definitely don't need to worry about different protocols. Meanwhile, Android phones can have the weirdest SoCs on the planet in them.
I admittedly have zero experience configuring hardware on ARM.
However, to say that hardware interfaces on the PC have been stable since the age of IBM PC clones is a joke. In the DOS days, users had to manually set I/O memory addresses and IRQ levels. Early Windows sat on top of DOS, so the same applied there. Plug'n'Play didn't show up until Windows 95, and even then it was hit and miss: some devices worked, others required manual configuration. RTM Win95 didn't support USB, either; that took the equivalent of a service pack (although, IIRC, they went by a different name back then; I want to say OSR1 added USB 1.1 support).

Windows drivers didn't really get friendlier until the push to the NT kernel and its HAL. Win2k had limited but good support, WinXP got better, Vista was a step backwards, and Win7, 8, and 10 have incrementally improved on Vista. Even on Windows 10, though, I have updates that "forget" a subset of my USB controllers. Windows doesn't know about drivers for my HOTAS. The most recent Windows updates cause it to forget my secondary monitor. Half the time after an update, my USB keyboard doesn't work (I have to log in on my desktop using an on-screen keyboard to correct settings). I'm probably one of only a small handful of people with a GeForce 690 and a 1080 in the same system.

Sure, the PC might be better in that most peripherals go over USB. But not all USB devices work with a generic driver, and Windows often doesn't include non-generic drivers for all but the most popular devices.
Let's ignore the historical legacy of non-PnP ISA configuration nightmares, because they don't bring anything of value to this discussion.
The core PC system is highly compatible with OSes in pretty much all directions. (And even if I don't ignore the non-PnP ISA configuration stuff, it doesn't really change the situation: that wasn't core-system stuff.)
Then you have drivers for various bus controllers and peripherals, some of which are crucial for using your PC in practice. But as long as the necessary driver exists, it is loaded by the kernel well after the core boot, and it is highly abstracted on all modern OSes (it only accesses the hardware through functions abstracted by the OS).
On non-ancient PCs, ACPI even gives you abstracted functions, provided by the hardware and used at runtime by the OS. For the purposes of this discussion, that should be considered part of the core platform, as if it were a pure hardware interface. The NT HAL, by the way, is a vestige of the early NT years and is of zero interest for PC compatibility today: only one HAL is in use on modern PCs, and IIRC, back when multiple HALs were in use, switching the HAL alone was not even enough; other MS binaries still needed to be recompiled. So the NT HAL is merely an internal detail with no consequence for backward or forward compatibility, as far as decoupling binaries and updating a partial subset of a system is concerned.
Now, the situation for ARM SoCs running Android is NOT the same. You can't just boot a generic ARM Linux kernel to drive the random ARM SoC in your random phone, because even what could constitute a core system equivalent to the PC's follows no standard.
You still need all of these things. An x64 Intel processor manufactured in 2017 still boots into real mode at startup, exactly like the original did 30 years ago, complete with segmented addressing and no memory protection, multitasking, or code privilege levels: just as PC/AT compatible as the day it was born.
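The segmented real-mode addressing mentioned above can be illustrated with a quick calculation: the physical address is the 16-bit segment shifted left by 4 bits plus a 16-bit offset, giving a 20-bit (1 MiB) address space in which many different segment:offset pairs alias the same byte. A small sketch (the wrap-at-1-MiB masking mimics the original 8086; later chips gate this behavior behind the A20 line):

```python
def real_mode_physical(segment, offset):
    """8086 real-mode translation: (segment << 4) + offset, wrapped to 20 bits."""
    return ((segment << 4) + offset) & 0xFFFFF  # wraps at 1 MiB like the original 8086

# Two different segment:offset pairs alias the same physical byte.
print(hex(real_mode_physical(0xB800, 0x0000)))  # 0xb8000, the text-mode video buffer
print(hex(real_mode_physical(0xB000, 0x8000)))  # 0xb8000 again
```

This is the address arithmetic every PC-compatible firmware and boot loader still has to speak for a few instructions before switching to protected or long mode.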
Thanks for reminding me of Plug'N'Pray. Some stuff worked, but a lot didn't, likely the cheap rubbish hardware I bought with lame drivers. And trying to load a mouse driver for DOS and play a game that needed HIMEM. Or trying to set the right IRQ to get a DOS game to believe you had an AdLib or SB16-compatible card...
Kids today don't know how easy things are with unified drivers for audio on macOS, etc. I mean, even USB coming out was amazing, instead of serial devices and guessing the COM port. Which LPT1? And dial-up, and so on. It's all so easy now.
All nouns are labels. Labeling in an argument goes in all directions, and there is no guarantee whatsoever that the parties involved share the same understanding of the labels. That we have enough common understanding even to create a language is due to a special human ability to understand each other without precise language, and hence without labels. The problem is not using labels, but people who intentionally mess with that ability to understand and then plant the blame on labels.
Whose fault is that? There are plenty of articles out there; pick one, or read a few and synthesize. Or go straight to the source: reddit.com/r/altright (warning: contains explicit white nationalism/anti-Semitism)
For everyone asking "why not just copy the history to a new tab?": it is part of the trail design. By saying they are not throwing the tab away, they have most probably already done exactly that. This will be another screen for viewing the trail, like the tab exposé in macOS Safari.
It's still moving fast and breaking things in ways that, while necessary, aren't reasonable choices for programs like Emacs or TeX. These programs need to run the same way in 20 years, and be runnable the same way in 50.