So it turns out that you can do better than a priority queue -- whether it's backed by a binary heap or a Fibonacci heap. I wasn't able to find details of Hagerup's Algorithm or any of several similar things that seemed like they'd be useful, but I was able to re-deduce them...
It turns out that in the case I'm dealing with, the path lengths are all integers. Better yet, they're all small integers. In the trivial starting case, they're all 1. So by keeping two lists -- nodes I'm working on and nodes I'm about to work on -- and a variable in each node, I can get a linear-time single-source shortest path algorithm. Linear, in this case, means O(V + E), and in my case E is always about equal to 3 * V. So it's O(V), better than the O((V + E) log V) I'd get from Dijkstra with a binary heap.
Say I just iterate through various path lengths, in order, a little like how Dijkstra would traverse it -- I start with a list of nodes that are one jump away from the start node. I use that to build the list of nodes that are *two* jumps away. I use that to build the list of nodes that are *three* jumps away... Every time I add a node to the list, I mark it as "off limits" so that no later (less desirable) list will add it back.
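That two-list idea can be sketched in a few lines of Python. This is just an illustration of the scheme above, not my actual code -- the graph is a plain adjacency dict, and the `dist` entry doubles as the "off limits" mark:

```python
def bfs_distances(adj, start):
    """Single-source shortest paths on an unweighted graph,
    using two plain lists instead of a priority queue.

    adj maps each node to a list of its neighbors."""
    dist = {start: 0}   # presence in dist == node is "off limits"
    current = [start]   # nodes at the current distance from start
    d = 0
    while current:
        nxt = []        # nodes one jump further out
        for u in current:
            for v in adj[u]:
                if v not in dist:   # not yet claimed by an earlier list
                    dist[v] = d + 1
                    nxt.append(v)
        current = nxt   # the "about to work on" list becomes current
        d += 1
    return dist
```

Each node and each edge is touched once, which is where the O(V + E) comes from.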
Boom! That's the algorithm. I can add a couple of additional speedups since I know that the graph diameter (that's the thing I'm looking for) is between the length of the longest shortest path and that length plus the length of the second-longest shortest path, which may let me cut off early in some cases. Yay! (Ignore this note if you don't understand it, it's an optimization and not a great one).
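For context, here's a minimal sketch of the diameter computation itself -- the plain version, without the early-cutoff optimization above. `bfs_ecc` is a hypothetical helper name; it runs the same level-by-level search and returns the eccentricity of one node (the length of its longest shortest path):

```python
def bfs_ecc(adj, start):
    """Eccentricity of start: how many levels the
    level-by-level search runs before running dry."""
    seen = {start}      # the "off limits" set
    frontier = [start]
    d = 0
    while True:
        nxt = []
        for u in frontier:
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    nxt.append(v)
        if not nxt:     # no new level: d is the eccentricity
            return d
        frontier = nxt
        d += 1

def diameter(adj):
    # Diameter = largest eccentricity over all start nodes.
    # One O(V + E) search per node, so O(V * (V + E)) total.
    return max(bfs_ecc(adj, s) for s in adj)
```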
But best of all, I now have an algorithm that is probably the same as the best one I know of -- and at worst, one that matches its runtime. Even though I couldn't find documentation on it.
So I'm not provably optimal, but I *am* state-of-the-art.