Sorry to nitpick, but for a good Bayesian, absence of evidence is evidence of absence. If you want the aphorism to be technically correct, you should say "absence of proof is not proof of absence".
A note on the terminology: "evidence" is a piece of data that suggests a conclusion, while not being conclusive by itself. Whereas "proof" is a piece of data that is conclusive by itself.
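To make the Bayesian point concrete, here's a minimal sketch (the probabilities are made up purely for illustration): if a hypothesis H makes some evidence E likely, then failing to observe E must lower the posterior probability of H.

```ocaml
(* Hypothetical numbers, chosen only for illustration. *)
let prior = 0.5           (* P(H) *)
let p_e_given_h = 0.8     (* P(E | H): evidence is likely if H is true *)
let p_e_given_not_h = 0.3 (* P(E | not H) *)

(* P(not E), by the law of total probability *)
let p_not_e =
  (1. -. p_e_given_h) *. prior
  +. (1. -. p_e_given_not_h) *. (1. -. prior)

(* P(H | not E), by Bayes' rule *)
let posterior = (1. -. p_e_given_h) *. prior /. p_not_e

let () =
  Printf.printf "P(H) = %.2f, P(H | no evidence) = %.3f\n" prior posterior
  (* The posterior (~0.222) is below the prior (0.5): absence of the
     expected evidence counts as (weak) evidence of absence. *)
```

The effect is small when P(E | H) is close to P(E | not H), which is why absence of evidence is usually only *weak* evidence of absence.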
For a long time my wife refused to accept that Tree Kangaroos existed and insisted that I'd made them up. When the internet came along she looked them up and treated me strangely for a while.
What things that you have never seen do you not believe in?
(not the OP) Giant isopods. They're not real. I know there are pictures of what are supposed to be giant isopods but they are not real animals, instead they're clearly fake models of made-up animals.
Are the plans for typed algebraic effects solidifying, or are they still nebulous? Concretely, are you willing to take a guess as to when we are expected to see OCaml 6? ;-)
Thanks for the reply. I hope that the array and list comprehensions land soon in upstream; it's a useful and hopefully not-too-controversial feature.
I'm more ambivalent regarding the local allocations and the unboxed types. I totally understand why they'd be useful when you are trying to squeeze every last drop of performance, but they do require a not-so-trivial complexification of the language.
The local allocations are less invasive than full support for typed effects. In particular, they are opt-in, and the associated complexity is pay-as-you-go. In my initial experiments, they seemed pretty nice to program with.
The type system for algebraic effects is still in the research and design phase at this point.
Right now, I am not even willing to guess at what the defining major new features of OCaml 6 will be (an effect system + modular implicits, maybe? Maybe not?).
Thanks for the reply. I'm hoping that modular macros land soon. I'm very ambivalent about the PPX mechanism, and I hope that modular macros reduce the need for PPXs.
Congratulations and a big thank you to the OCaml team! I hope that multicore support finally ticks all the requirement boxes that had prevented many from taking a serious look at OCaml. The language certainly deserves it: it hits that sweet spot between expressiveness, performance, and pragmatism like no other.
I have to interject here for the sake of those unfamiliar with OCaml and who may take the parent comment at face value.
Saying "it has quirks like using ;; to end statements" is misleading to the point of just being bogus. The double semi-colon is only ever used in the REPL. In fact, I've been programming OCaml for fun and professionally for over 15 years, and I've never used a double semi-colon in my code, nor have I ever encountered one in the "wild".
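To illustrate the distinction: in a source file, top-level definitions need no terminator at all, whereas the `;;` exists only to tell the interactive toplevel "evaluate what I've typed so far". A minimal example:

```ocaml
(* Ordinary OCaml source file: no ';;' anywhere. *)
let square x = x * x

let () =
  Printf.printf "%d\n" (square 6)  (* prints 36 *)

(* Only in the REPL (e.g. ocaml or utop) would you type:
     # let square x = x * x;;
     # square 6;;
   because the toplevel needs to know the phrase is complete. *)
```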
Indeed. Considering the volume of transactions going to and from Coinbase, their adoption of Segwit would go a long way towards alleviating the current mempool situation, which would bring lower fees for everyone.
As others have pointed out, your understanding is backwards. Mining is heavily centralised already (in China of all places!), and larger block sizes would only exacerbate the mining centralisation, which is probably why the biggest fans of larger blocks are the Chinese miners.
> Segwit isn't a block size increase. It allows for a tiny amount more transactions but it's very clear it's not enough. An actual block size increase (say to 8MB) would solve the current problems.
But what about the new problems it would introduce? I run a full node and even now it eats a significant portion of my bandwidth. With even larger blocks I would probably drop off the network altogether. And I'm sure I'm not the only one out there in this situation. Therefore, I don't think Core developers are exaggerating in their concerns about the centralization pressure caused by overly large blocks.
Moreover, wouldn't increasing the block size simply kick the can down the road? One advantage of the current fee pressure is that it strongly encourages the development of 2nd layer solutions. There are right now at least three independent teams working on Lightning Network implementations and they seem to be making quick progress...
We're pretty good at disease control; see, e.g., the recent Ebola outbreak. An outbreak of anything dangerous would be quickly contained. There are no organizations with the capacity to surreptitiously produce, distribute, and release an infectious agent broadly enough that it might kill in the billions before being contained.
That basically leaves the US or Russia (the only countries with the capacity to do so) arbitrarily deciding to nuke most cities in south Asia.
I'd give China more of a chance than Russia, actually. They have more resources. (But perhaps less willingness than Russia: China is doing very well with the status quo.)
I don't think China has the nuclear warhead stockpile to kill 1 billion+ people. Wikipedia says "Current stockpile (usable and not): ~260" which means each would have to kill 3.8 million, which basically means delivering each perfectly into a major city. That's a non-trivial operation to put it mildly.
That said, I don't see any objective (rational or otherwise) of either country that would be advanced by such an attack. And again, we're talking about an attack that will kill 1B+ people, not just any nuclear attack -- that's a different story. Still low probability, but at least there are semi-rational objectives that could motivate it.
Oh, I didn't mean to imply the Chinese had enough stockpiles. I meant they have a big enough economy to rapidly build up their nukes (or invest in bioweapons etc), should they decide to.