I think it warrants an explanation though. Why not DFS? Too hard to dedupe? What’s the branching factor and the duplication rate? If the algorithm is 50x faster due to lower overhead, it might be worth it.


DFS would probably end up with similar memory requirements because it still has to dedupe: naively you would need to store all the visited states anyway, just like with BFS.
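
Rough sketch of what I mean (Python; start and neighbors() are hypothetical stand-ins for the actual problem):

    def dfs_count(start, neighbors):
        # Iterative DFS that dedupes with an explicit visited set.
        # The set grows by one entry per unique state reached -- the
        # same O(number of states) memory BFS needs for its own
        # visited set, so DFS buys you nothing on that axis.
        visited = {start}
        stack = [start]
        while stack:
            state = stack.pop()
            for nxt in neighbors(state):
                if nxt not in visited:
                    visited.add(nxt)
                    stack.append(nxt)
        return len(visited)

The only real saving is the frontier: the stack holds O(depth * branching) entries instead of a whole BFS level, but the visited set dominates either way.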

You might see some interesting effects using a Bloom filter and IDDFS. It would turn the whole search probabilistic, but it might be fast enough that you could run it enough times to remove reasonable doubt.
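
Hand-wavy sketch of that combination (Python; the Bloom filter is hand-rolled on blake2b, and neighbors/encode/goal are hypothetical callbacks for whatever the state space is):

    import hashlib

    class BloomFilter:
        # k hash functions derived from blake2b with distinct salts.
        # False positives are possible, false negatives are not, so
        # the search may occasionally skip a state it never saw.
        def __init__(self, bits, k=4, seed=0):
            self.bits = bits
            self.k = k
            self.seed = seed
            self.array = bytearray(bits // 8 + 1)

        def _indexes(self, item):
            for i in range(self.k):
                salt = (self.seed + i).to_bytes(16, "big")
                h = hashlib.blake2b(item, salt=salt).digest()
                yield int.from_bytes(h[:8], "big") % self.bits

        def add(self, item):
            for i in self._indexes(item):
                self.array[i // 8] |= 1 << (i % 8)

        def __contains__(self, item):
            return all(self.array[i // 8] & (1 << (i % 8))
                       for i in self._indexes(item))

    def iddfs(start, neighbors, encode, goal, max_depth,
              bloom_bits=1 << 27, seed=0):
        # One depth-limited DFS per bound, each with a fresh Bloom
        # filter as a fixed-size, probabilistic visited set.
        for bound in range(max_depth + 1):
            seen = BloomFilter(bloom_bits, seed=seed)
            seen.add(encode(start))
            found = dls(start, neighbors, encode, goal, bound, seen)
            if found is not None:
                return found
        return None

    def dls(state, neighbors, encode, goal, bound, seen):
        if goal(state):
            return state
        if bound == 0:
            return None
        for nxt in neighbors(state):
            key = encode(nxt)
            if key in seen:  # may be a false positive: branch wrongly pruned
                continue
            seen.add(key)
            found = dls(nxt, neighbors, encode, goal, bound - 1, seen)
            if found is not None:
                return found
        return None

Memory stays fixed at bloom_bits no matter how many states you touch; the price is that a false positive can prune a branch that actually led somewhere, hence the repeated runs with different seeds to build confidence.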


The problem seems to have many early branches, which makes deep searches much less useful.
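
To put rough numbers on it (illustrative, not measured): with an effective branching factor of, say, b = 10 near the root, a depth bound of d already touches on the order of 10^d paths, and iterative deepening re-walks the shallow layers on every pass (total work roughly b/(b-1) times the final pass). If most of those paths are duplicates of each other, a search that can't dedupe exactly burns nearly all its time re-expanding the wide early levels before depth buys it anything.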



