At first, the results claimed here seem to clash with known hardness results: for instance, it is NP-hard to 5-color a graph that is promised to be 3-colorable [1]. Of course, the devil is in the details. On page 11, where the authors describe the "LOCAL" computation model they work in, they explain:
"In each round the processors may perform unbounded computations on their respective local state variables and subsequently exchange of messages of arbitrary size along the links given by the underlying input graph."
Given this setup, it should be possible to solve any graph problem in O(n) rounds, where n is the number of nodes, and indeed they point this out in the introduction:
"If the chromatic number of G is χ, in this setting it is trivial to find a χ-coloring in T = O(n) rounds, as in O(n) rounds all nodes can learn the full topology of their own connected component and they can locally find an optimal coloring by brute force without any further communication."
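The argument in the quote can be made concrete with a small simulation. The sketch below is my own illustration (not code from the paper): in each round every node merges its neighbors' knowledge of the edge set, so after at most n rounds each node knows its entire connected component and can brute-force an optimal coloring with no further communication.

```python
from itertools import product

def local_learn_topology(adj):
    """adj: dict node -> set of neighbors.
    Simulates n rounds of the LOCAL model where nodes flood
    their known edges to neighbors; returns each node's known edge set."""
    known = {v: {(v, u) for u in adj[v]} for v in adj}  # round 0: own edges
    for _ in range(len(adj)):  # O(n) rounds suffice
        new = {v: set(known[v]) for v in adj}
        for v in adj:
            for u in adj[v]:          # exchange along input-graph links
                new[v] |= known[u]
        known = new
    return known

def brute_force_color(edges):
    """Find an optimal (chi-)coloring of the graph given by `edges`
    by trying k = 1, 2, ... colors exhaustively."""
    nodes = sorted({x for e in edges for x in e})
    for k in range(1, len(nodes) + 1):
        for assignment in product(range(k), repeat=len(nodes)):
            color = dict(zip(nodes, assignment))
            if all(color[u] != color[v] for u, v in edges):
                return color
    return {}

# Example: a 5-cycle, whose chromatic number is 3.
adj = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}
known = local_learn_topology(adj)
coloring = brute_force_color(known[0])  # node 0 colors its component locally
assert len(set(coloring.values())) == 3
```

The exponential-time brute force inside a single node is exactly what the "unbounded local computation" assumption of the LOCAL model permits, which is why the O(n)-round bound does not contradict NP-hardness.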
While this paper is very interesting from a theoretical perspective, it is saying more about the LOCAL model of distributed computing than it is about quantum computing. Nobody should come away from this with the conclusion that we have somehow proved that quantum algorithms are useless for real-world problems.
[1] https://arxiv.org/abs/1811.00970