I haven't run it yet as I'm travelling, so this is speculation based on my reading of the docs. The documentation suggests that some optimizations are implemented using Maple, and that the software can be run without it, albeit without those optimizations.
It doesn't require a Maple license, exactly. Maple is only needed to use the simplifier. It is perfectly possible to write and run Hakaru programs without having Maple installed.
I occasionally use probabilistic programming systems, and find this project fascinating, but I've been wondering for a while what the vision is.
Is it meant mainly as a research project/proof of concept (I've seen elsewhere that it's funded by the DARPA PPAML project), or is it intended to become a commonly used piece of software with a community like PyMC3 and Stan?
The PPAML project yielded a lot of cool work; I think many people are still figuring out the most impactful use case. I believe the most mature new production language is Figaro, out of Charles River. They have some good example projects and an introductory text for their language.
I think PyMC is a much more mature solution. But while PyMC is focused mostly on sampling, Hakaru is focused on Bayesian inference more broadly: it tries to represent stochastic models in a way that any inference algorithm could be applied to them.
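To illustrate the idea (this is a toy sketch in Haskell, not Hakaru's actual design): if the model is written once against an abstract interface, different interpreters can consume the same model term. Here `Dist`, `Mean`, and the `model` are all hypothetical names for illustration.

```haskell
-- A model is written against an abstract interface, so the same
-- term can be run by different interpreters (sampler, density
-- evaluator, symbolic simplifier, ...).
class Monad m => Dist m where
  normal :: Double -> Double -> m Double  -- mean, std dev

-- The model is polymorphic in the interpreter.
model :: Dist m => m Double
model = do
  x <- normal 0 1
  y <- normal x 1
  return (x + y)

-- One possible interpreter: forward mean propagation (exact here
-- only because the model is affine in its random variables).
newtype Mean a = Mean { runMean :: a }

instance Functor Mean where
  fmap f (Mean a) = Mean (f a)

instance Applicative Mean where
  pure = Mean
  Mean f <*> Mean a = Mean (f a)

instance Monad Mean where
  Mean a >>= k = k a

instance Dist Mean where
  normal mu _ = Mean mu

-- runMean model == 0.0
```

A sampling interpreter or a log-density interpreter would just be further instances of `Dist`, with `model` left untouched.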
That's the concrete syntax for the language. It was chosen to make the language more familiar to people who do machine learning in Python. The embedded design we had before made it very challenging to develop new inference algorithms and to combine them. You can still find that version of Hakaru at https://github.com/zaxtax/hakaru-old
I expect no significant challenges in porting to more recent versions of GHC. Some version bounds will need to be loosened, and an Applicative instance will need to be added for the Measure monads. Also, if you have any feature requests for that version, I'd be curious what they are.
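For context, the change in question is the standard Applicative-Monad Proposal (AMP) fix: since GHC 7.10, `Applicative` is a superclass of `Monad`, so pre-AMP monads need an instance like the one below. This sketch assumes a `Measure` type shaped like a sampling function, which is hypothetical; Hakaru's real `Measure` may differ.

```haskell
import Control.Monad (ap, liftM)
import System.Random (StdGen, randomR)

-- Hypothetical stand-in for Hakaru's Measure type: a sampler
-- threading a random-number generator.
newtype Measure a = Measure { unMeasure :: StdGen -> (a, StdGen) }

instance Functor Measure where
  fmap = liftM

-- The instance the AMP requires: any lawful Monad gives rise to
-- an Applicative via `pure`/`ap`.
instance Applicative Measure where
  pure x = Measure (\g -> (x, g))
  (<*>)  = ap

instance Monad Measure where
  return = pure
  Measure m >>= k = Measure (\g ->
    let (x, g') = m g in unMeasure (k x) g')

-- A primitive distribution, for illustration.
uniform :: Double -> Double -> Measure Double
uniform lo hi = Measure (randomR (lo, hi))
```

The `pure = return`-style boilerplate is mechanical, which is why the port mostly amounts to loosening version bounds.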