Validator uptime and stability reports for Story (beta)

The STAKEME team has launched a new tool that helps validators clearly demonstrate their real uptime and on-chain performance, and that makes validator evaluation easier for the community and the Story Foundation team during delegation rounds and application reviews.

With this tool, validators can generate public, shareable performance reports that include:

• validator uptime and stability over time

• signed and missed blocks (based on real on-chain data)

• overall performance indicators that reflect how reliably the node is operating

• clear, easy-to-read metrics for validators, delegators, and ecosystem teams

Link: https://trackval.storyscan.app/

Why we built this

Right now, it’s difficult to objectively compare validators or quickly verify claims about uptime and reliability.

Application forms and delegation processes often rely on self-reported data, scattered screenshots from different dashboards, and manual checks.

Instead of sending random screenshots, validators can now simply share one clean report link that shows their actual performance.

How to use it:

  1. Find your validator in the explorer
  2. Click Generate report
  3. Share the report link or a screenshot

Current status

This feature is currently in beta testing, and your feedback is very important to us.

If you notice anything unclear or missing, or if you have ideas for additional metrics that would be useful, please let us know. We plan to actively improve the tool based on community feedback.

Our goal is to support a healthy, fair, and transparent validator ecosystem and make high-quality operators more visible and easier to evaluate. Thank you for your attention.


Any way you can generate a report on a weekly basis and post it on the forum?


We’ll think about how to do this better. :saluting_face:

Thanks for this! Looks helpful

https://trackval.storyscan.app/report/cryptomolot-1768677792172


Hi @STAKEME team :waving_hand:

I'd propose calculating the average uptime in reports only over periods when the validator is in the active set; currently it is calculated over the entire report window. For example, if a validator was active for only 1 day out of 7 and missed just 5 of 20,000 blocks, its real uptime is ~99.98%, but a 7-day report shows an average of ~14%. This can significantly harm a validator's reputation if a regular user visits the page and sees 15% uptime in the latest reports.
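To make the difference concrete, here is a minimal sketch of the two formulas (all numbers are hypothetical, chosen to match the example above; the actual block rate and report logic may differ):

```python
# A validator is in the active set for 1 day of a 7-day window and signs
# 19,995 of the 20,000 blocks produced while it was active.

blocks_per_day = 20_000          # assumed block rate, for illustration only
window_days = 7
active_days = 1
missed_while_active = 5

signed = active_days * blocks_per_day - missed_while_active

# Current behavior: signed blocks divided by every block in the report window.
uptime_full_window = signed / (window_days * blocks_per_day)

# Proposed behavior: signed blocks divided by blocks produced while active.
uptime_active_only = signed / (active_days * blocks_per_day)

print(f"full window: {uptime_full_window:.1%}")   # roughly 14%
print(f"active only: {uptime_active_only:.3%}")   # roughly 99.98%
```

The same raw signing data yields ~14% under the current calculation and ~99.98% under the active-set-only one, which is the gap being flagged.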

Overall, the UI looks great.


First impression - looks great!

I just have some concerns about accuracy, more specifically the number of scanned blocks. We ran a report using your system, and over the last 6 months it indicated fewer than 4M total blocks, which according to our tooling is much lower than it should be. Our internal result is around 6.5M blocks. This is more or less aligned with what was shared in another topic:

“Story’s original design targeted 20 million IP tokens emitted annually, based on an expected 10,368,000 blocks per year. Engineering optimizations have improved block production to approximately 13,140,000 blocks per year.”

Where might these discrepancies be coming from?
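As a rough sanity check on the numbers above (a back-of-the-envelope only, assuming a constant block rate over the period):

```python
# Expected block counts over 6 months at the two annual rates quoted above.

SECONDS_PER_YEAR = 365 * 24 * 3600

rates = {
    "original design": 10_368_000,       # blocks per year
    "after optimizations": 13_140_000,   # blocks per year
}

for label, blocks_per_year in rates.items():
    block_time = SECONDS_PER_YEAR / blocks_per_year   # seconds per block
    six_months = blocks_per_year / 2                  # blocks in half a year
    print(f"{label}: ~{block_time:.2f}s blocks, ~{six_months / 1e6:.2f}M blocks per 6 months")
```

At the optimized rate, 6 months works out to ~6.57M blocks, which lines up with the ~6.5M internal result; even the original design rate gives ~5.18M. A report showing under 4M would therefore suggest the tool is scanning a shorter effective window or skipping block ranges, rather than a different assumed block time.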

Other than that, I think it’s a great contribution to the ecosystem - thank you!