Performance has nothing to do with it. You can think of the application as a very specialised version of `wget`, and I like CLI system applications to be easy to distribute, with no dependencies and without requiring an entire VM to run.
Erlang/Elixir is good for networked servers. For CLI apps I choose either Go or Rust, and I much prefer the latter. Since I want to make this component open source, I also want to keep it separate from the rest of the closed-source, proprietary Elixir backend.
There is no wrong choice, to be honest; this is mine for my startup, and I don't think I need more justification than "I am productive with it for the task at hand."
Rust also has the benefit of eliminating whole classes of bugs, which is great for the core of a product that connects to the network, fetches untrusted content, parses it, and (presumably) stores it somewhere.
Not that you can't still have bugs, but when you can get C-levels of speed with Python-levels of safety, why not?
A crawler should be limited by the network, not the CPU. Outside of the language making it easier to handle many concurrent connections, I doubt raw speed would be much of a consideration.
The Rust app is kept as a standalone application. Elixir spawns the process, passes it an internal URL where it can post its output, and forgets about it, since each run might take anywhere from a few minutes to several hours to complete.
But everything else, including scheduling the execution of this crawler and parsing its output, is done in the Elixir monolith.
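To make that split concrete, here is a minimal sketch of the Elixir side under my own assumptions: the module name, binary name, flag, and callback route are all placeholders, not the actual project's code.

```elixir
defmodule MyApp.Crawler do
  # Fire-and-forget: launch the standalone Rust binary as an OS process,
  # handing it an internal URL where it should POST its results when done.
  # The binary name, flag, and callback route are illustrative placeholders.
  def start_run(target_url, run_id) do
    callback = "http://internal.example.local/api/crawl_results/#{run_id}"

    Task.start(fn ->
      # System.cmd/3 blocks until the binary exits, which may take hours,
      # so it runs inside its own Task rather than the caller's process.
      System.cmd("crawler", [target_url, "--post-to", callback],
        stderr_to_stdout: true
      )
    end)

    :ok
  end
end
```

Scheduling (say, a periodic job that calls `MyApp.Crawler.start_run/2`) and parsing whatever the crawler posts back to that callback URL stay entirely on the Elixir side.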