The previously best known algorithm for deciding whether a given edge can be added to a planar n-node graph while keeping it planar took O(sqrt(n)) time.
The new algorithm needs only O((log n)^3) time, an exponential speedup.
> By considering a lot of cases, find O(1) candidates to the flip nearest u that improves the embedding.
Also a comment underneath by one of the authors:
> We have not tested it (yet), but I suspect the current version will be too slow in practice due to the ~400 cases it has to check for each flip. Also, some of the core data structures we use do not yet have a good practical implementation. I do plan on making an implementation at some point, but that could be months or years away.
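For concreteness, here's the query being sped up, answered the slow way (a sketch of mine with networkx, not the paper's method): tentatively insert the edge and re-run a full static planarity test, which is linear time per query. The previous dynamic algorithms bring that down to O(sqrt(n)) per update, and the new one to O((log n)^3).

```python
import networkx as nx

def can_add_edge_planar(G, u, v):
    """Return True if adding edge (u, v) would keep G planar (naive re-check)."""
    H = G.copy()
    H.add_edge(u, v)
    return nx.check_planarity(H)[0]

G = nx.grid_2d_graph(3, 3)                     # a planar 3x3 grid
print(can_add_edge_planar(G, (0, 0), (2, 2)))  # True: the corners share the outer face
K5_minus = nx.complete_graph(5)
K5_minus.remove_edge(0, 1)                     # K5 minus one edge is planar...
print(can_add_edge_planar(K5_minus, 0, 1))     # False: restoring it gives K5
```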
It's somewhat meaningless to compute exact cross-overs for big-O complexities without the constant factors. That is, the actual times for the algorithms are C1 * sqrt(n) + ... and C2 * log(n)^3 + ..., where the ... are lower-order terms. The exact ratio of C1 to C2 (and even the lower-order terms) can have a large impact on the crossover.
Of course the constant terms matter, but sqrt(N) < log2(N)^3 up until billions of nodes. For 2^32 nodes sqrt(N) is 2^16 and log2(N)^3 is 2^15. We deal with graphs of that size in electronic design automation, and planar graphs don't need metal crossovers, but we aren't adding one connection at a time. So this result in itself isn't going to be terribly applicable, but it may lead to further insights and the development of improved graph algorithms.
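A quick numeric check of those numbers (my own back-of-the-envelope, ignoring the constants discussed above):

```python
# Compare sqrt(N) against log2(N)^3 at a few sizes, constant factors ignored.
from math import sqrt, log2

for exp in (20, 29, 30, 32):
    n = 2.0 ** exp
    print(f"N = 2^{exp}: sqrt(N) = {sqrt(n):.3e}, log2(N)^3 = {log2(n)**3:.3e}")

# At N = 2^32: sqrt(N) = 65536 = 2^16 and log2(N)^3 = 32768 = 2^15, matching the
# figures above. With equal constants the break-even sits a bit below N = 2^30
# (roughly a billion nodes); real constants and lower-order terms can move it a lot.
```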
I work with the researchers, and I think they'll be interested in knowing which version of the problem practitioners like yourself are most interested in. Do you ever add/remove edges, perhaps in bulk? Or are you mostly interested in finding "best possible" layouts for static datasets?
Practical edge crossing minimization for large graphs (e.g. a billion nodes) is a really hard problem.
An alternative formulation that's seen some use in graph drawing & circuit design software is to describe the embedding problem as an optimization problem: find an assignment of coordinates to nodes such that the edges (line segments) don't cross.
It turns out that estimating a lower-bound on the # of edge crossings of an embedded graph is equivalent to solving an SVM-type problem, and one can characterize the optimal layout as the solution to a nonlinear optimization problem. What's typically done, though, is to alternate between finding the bound for a fixed embedding & optimizing the embedding with something like gradient descent.
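As a rough illustration of that alternation (a toy sketch of mine, with a plain spring-energy gradient step standing in for the SVM-style machinery): evaluate the crossing count for the current straight-line embedding, take a gradient step on a smooth proxy objective, and repeat.

```python
import numpy as np

def orient(a, b, c):
    # Sign of the z-component of (b - a) x (c - a).
    return np.sign((b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]))

def segments_cross(p1, p2, p3, p4):
    """True if segments p1-p2 and p3-p4 properly cross (orientation tests)."""
    return (orient(p1, p2, p3) * orient(p1, p2, p4) < 0 and
            orient(p3, p4, p1) * orient(p3, p4, p2) < 0)

def crossing_count(pos, edges):
    """Exact crossing count of the straight-line drawing `pos` (checks all edge pairs)."""
    count = 0
    for i, (a, b) in enumerate(edges):
        for c, d in edges[i + 1:]:
            if len({a, b, c, d}) == 4 and segments_cross(pos[a], pos[b], pos[c], pos[d]):
                count += 1
    return count

def spring_step(pos, edges, lr=0.05, rest=1.0):
    """One gradient-descent step on the smooth spring energy sum (|pa - pb| - rest)^2."""
    grad = np.zeros_like(pos)
    for a, b in edges:
        d = pos[a] - pos[b]
        dist = np.linalg.norm(d) + 1e-9
        g = 2.0 * (dist - rest) * d / dist
        grad[a] += g
        grad[b] -= g
    return pos - lr * grad

# K4 from a random start: alternate "evaluate crossings" / "improve the layout".
rng = np.random.default_rng(0)
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
pos = rng.normal(size=(4, 2))
for it in range(51):
    if it % 10 == 0:
        print(f"iter {it:2d}: crossings = {crossing_count(pos, edges)}")
    pos = spring_step(pos, edges)
```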
It gets a bit more confusing once you realize IC netlists are hypergraphs, where edges correspond to sets of nodes (e.g. the "embedding" of an edge, i.e. a set of nodes, is sometimes modeled as a rectilinear Steiner tree).
For edge crossing/congestion minimization, look up Shabeer, "Edge crossing minimization"; Spindler, "Congestion driven placement" (RUDY); and RePlace for a state-of-the-art academic force-directed placement algorithm.
I don't work on placement problems, though part of my work provides input for the placement problem (produce an optimized logic netlist that then has to be placed and routed). But if I can suggest something in the same ballpark: imagine that you have a large graph and you want to partition it into a small number of partitions (say, 4 to 10) of similar size. You want each partition to be a planar graph, and you want the number of interconnects between these graphs to be minimal. The graphs would correspond to metal layers on an ASIC. There's a lot of fuzziness in the way I stated it; to make it more rigorous a suitable cost function would need to be defined.
Someone correct me, but I think hMetis is one popular software/algorithm for multi-level graph/hypergraph partitioning. There is also KaHyPar which is a bit more academic.
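For a rough feel of that flow, here's a toy sketch with networkx (nothing like what hMetis/KaHyPar actually do internally, which is multilevel coarsening on hypergraphs with balance constraints): recursively bisect with Kernighan-Lin, then look at the cut size and whether each part happens to be planar.

```python
import networkx as nx
from networkx.algorithms.community import kernighan_lin_bisection

def bisect_into(G, k):
    """Split G's nodes into k ~equal parts by recursive KL bisection (k a power of 2)."""
    parts = [set(G.nodes)]
    while len(parts) < k:
        part = parts.pop(0)
        a, b = kernighan_lin_bisection(G.subgraph(part))
        parts += [set(a), set(b)]
    return parts

G = nx.random_regular_graph(4, 200, seed=1)   # stand-in for a netlist-like graph
parts = bisect_into(G, 4)
cut = sum(1 for u, v in G.edges if not any(u in p and v in p for p in parts))
print("cut edges between parts:", cut)
for i, p in enumerate(parts):
    planar, _ = nx.check_planarity(G.subgraph(p))
    print(f"part {i}: {len(p)} nodes, planar = {planar}")
```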
I'm not current on this stuff at all, but I think planarity mattered more in the Mead-Conway era than today. Modern chips have lots of separate metallization layers so you can avoid crossings. Vias aren't free, but they exist so you can use them.
You're using natural log; for my quick-and-dirty estimate I used log2, which gives a slightly higher crossover point. The exact crossover depends on the size of the constant term, of course.
Wholeheartedly agree.
Graph theory is probably one of the most relatable, accessible theories available.
It is a fantastic tool for discovering algorithmics, as you can apply your thinking visually.
FWIW, I wrote my master's thesis on small-world graphs (_ie_ Kevin Bacon, six degrees of separation, that kind of graph), building structures to add properties to such graphs, and it is, without a doubt, the best memory of my time at university.
I also like graph theory. I don't think kids should have to learn it. It's much, much less useful than statistics, linear algebra, and calculus in everyday life or most university courses. Math in schools should, imho, be a mixture of mostly "how to lie with statistics" with a little preparation for engineering and science degrees. People need an intuition for statistical reasoning to understand the news and participate in democracies. It's about as important as being able to read complex texts. You need some calculus and linear algebra to understand basic physics. Most other areas of math are primarily interesting to Mathematicians and Computer Scientists.
Can’t find the link now, but I’ve seen a pedagogical framework for learning basic probability by manipulating and “walking” dynamic graphs. Absolutely usable with kids.
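Something in that spirit, as a guess at what such an exercise might look like (a toy simulation of mine, not the framework itself): have the kids draw a little map, walk it at random, and compare their intuition with the empirical frequencies.

```python
import random
from collections import Counter

neighbors = {                      # a tiny "board" the kids could draw
    "home": ["park", "school"],
    "park": ["home", "school", "shop"],
    "school": ["home", "park"],
    "shop": ["park"],
}

def walk(start, steps):
    node = start
    for _ in range(steps):
        node = random.choice(neighbors[node])
    return node

# Where does a 3-step random walk from "home" end up?
counts = Counter(walk("home", 3) for _ in range(10_000))
for node, c in counts.most_common():
    print(f"{node}: {c / 10_000:.2f}")
```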
Yeah, graphs are often useful for modelling, but typically you need very little basic graph theory to use them for that purpose. Diestel [1], for example, spends dozens of extremely dense pages just to introduce the basic terminology. To cover just the contents of the first six chapters (which I consider to be the basics for algorithmic applications of graphs) you'd probably fill most of the high school math curriculum, even if you skip almost all proofs.
Yes, I actually had a graph theory session with 6-year-olds! We talked about graphs, graph coloring, Eulerian paths, etc., all with pencils and paper. It was a 'talk about your job to schoolchildren' day, and I thought they might relate to graphs.
It was so successful that all kindergarten teachers in the same school started doing it as well.
The unexpected part was the other parents, who looked at the poster the children had made about chromatic numbers, complete with examples: quite sadly, they were a lot more scared than their children!
As someone who hated math and was really bad at it since calculus (mostly due to lack of rigor), yes. Once I got to algorithms class, I struggled at first but after a session with my (really excellent) professor, something “clicked”.
I really think graph theory and discrete math in general should be taught much earlier. Maybe even before (or alongside) calculus in high school. Calculus isn’t really a legitimate prerequisite in this case.
Specifically for planarity, it's pretty easy to convince yourself that K5 and K3,3 are minimally non-planar. You don't come out with a rigorous proof but you do get an intuitive feel. With a decent motivating example it's accessible to kids.
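A quick way to check that intuition with networkx: both graphs are non-planar, and removing any single edge makes either one planar.

```python
import networkx as nx

def planar_after_removing(G, e):
    """True if G becomes planar once edge e is removed."""
    H = G.copy()
    H.remove_edge(*e)
    return nx.check_planarity(H)[0]

for name, G in [("K5", nx.complete_graph(5)),
                ("K3,3", nx.complete_bipartite_graph(3, 3))]:
    print(f"{name}: planar = {nx.check_planarity(G)[0]}, "
          f"planar after removing any one edge = "
          f"{all(planar_after_removing(G, e) for e in G.edges)}")
```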
Knowledge graphs are so under-utilised for practical things despite the fact that there is a significant amount of useful research out there. Their problem is that they are expensive. I hope further research in ML will help elevate their relevance significantly. I am particularly fond of using knowledge graphs and graph theory for security-related work.