It wouldn’t be Google’s PageRank. All of the other companies that run crawlers already have their own metrics (which are probably fairly robust by now), so it would basically just be “Yet Another Self-Made Link Metric”. People already have enough metrics to look at when buying/selling links (and often they don’t even know what those older metrics measure, or how they’re calculated); another one isn’t going to change any of that.
Playing with big data sets can be fun, and trying things with PageRank & other graph calculations can be fun, but none of it is really useful compared to spending that time making a better website. If anyone reading this is keen on building something like this, imo, go build it – you will learn a lot along the way, and that knowledge & experience will be useful. The numbers that come out, not so much, but what you learn along the way will be.
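For anyone curious what “trying things with PageRank” actually involves, here is a minimal sketch of the classic power-iteration approach on a toy link graph. This is purely illustrative – the damping factor, tolerance, and graph are made-up values, and real search engines obviously run something far more elaborate:

```python
# Toy PageRank via power iteration.
# Assumptions (not from any real system): damping=0.85, a tiny
# hand-made graph, and uniform redistribution for dangling pages.

def pagerank(links, damping=0.85, tol=1e-9, max_iter=100):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(max_iter):
        # Every page gets a baseline share of (1 - damping).
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # A page passes its rank evenly to the pages it links to.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new[p] += damping * rank[page] / n
        if sum(abs(new[p] - rank[p]) for p in pages) < tol:
            return new
        rank = new
    return rank

# Hypothetical three-page site: a links to b and c, b to c, c back to a.
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

Running this, "c" ends up with the highest score because it receives links from both "a" and "b" – which is exactly the kind of intuition-confirming result that is fun to build, and still tells you nothing about where Google will rank the page.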
This is no different from the other made-up metrics. The problem with all of these is that it’s not difficult to come up with a rank system that matches up with publicly available SERP data. The problem comes when people use that metric to optimize a site. Then you end up with posts asking why a site with far better content and a higher DA is outranked by a site with crappy content and a lower DA.
It’s not a ranking signal, and it’s not a good predictor of actual SERPs. It may have some use as a measure of progress, but even that is fairly irrelevant, since the best measure of progress is your actual rank.
All of these made up metrics are marketing gimmicks that are good for selling a product.