
10. Define what success looks like and publish performance data

Work out what success looks like for your service and identify metrics which will tell you what’s working and what can be improved, combined with user research.

Identify metrics which will indicate how well the service is solving the problem it’s meant to solve, and track performance against them

  • For operators, the key metric is whether satellite operators have used Monitor Space Hazards to change their behaviour, that is, either to move or not to move a satellite, in a way they would not have done without it. We have not yet seen an instance of this, but we are monitoring for it through our user research sessions.
    • This is more likely to occur once API access to OAs is working
  • There are sub-metrics which we are using to track whether or not they are in a position to make these decisions:
    • Are they getting data in a timely manner and a comprehensible format? YES
    • Are they getting UKSA OA data? YES
    • Are they getting notifications which are tuned to their preferences? YES
    • Are they getting data to allow them to analyse the relative safety of their orbits? NO - will require further work
  • During private beta, several performance metrics were assessed and user research was used to identify our main KPIs. The KPIs we have chosen to focus on can be seen in our KPI design documentation.
  • During public beta, we have created more endpoints to make sure we are tracking all the rich and useful data that we can, helping us to measure the effectiveness of Monitor Space Hazards (a sketch of this kind of endpoint follows this list). These endpoints include:
    • Number of objects tracked
    • Number of high probability events
    • Number of notifications sent
    • Volume of analysis generated
  • We are also in the process of defining appropriate metrics to assess the success of the service providing escalated conjunction and re-entry alerts to government users.
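
To make the idea concrete, the sketch below shows roughly what an endpoint exposing these counts could look like. It is a minimal illustration only: the framework (Flask), the route name and the in-memory counters are assumptions for the purpose of the example, not the service's actual implementation.

```python
# Illustrative sketch only: the real Monitor Space Hazards endpoints, framework and
# data store are not published here, so the names below are assumptions.
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical in-memory counters standing in for the service's real data store.
METRICS = {
    "objects_tracked": 0,
    "high_probability_events": 0,
    "notifications_sent": 0,
    "analyses_generated": 0,
}

@app.route("/metrics/<name>")
def metric(name: str):
    """Return a single performance metric as JSON, or 404 if it is unknown."""
    if name not in METRICS:
        return jsonify({"error": f"unknown metric: {name}"}), 404
    return jsonify({"metric": name, "value": METRICS[name]})

if __name__ == "__main__":
    app.run()
```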

Use performance data to make decisions about how to fix problems and improve the service

Monitor Space Hazards uses the third-party service provider Logit, together with GOV.UK PaaS tools, to monitor the health of the service:

  • Logit is used to log and monitor errors and system crashes. It is an observability service for cloud-scale applications, providing monitoring of servers, databases, tools and services through a SaaS-based data analytics platform. It takes in feeds from GOV.UK PaaS.
    • All user activity is logged in Logit.io and has been analysed for risk and potential bad-actor behaviour
  • Piwik Pro will be used for front-end analytics: it tracks and reports website traffic and user-related metrics.
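
As an illustration of the kind of log data an aggregator such as Logit can ingest, the sketch below emits structured JSON log lines using Python's standard logging module. The field names and the audit event are assumptions made for illustration; the service's real log format is not documented here.

```python
# Minimal sketch of structured logging for a log aggregator such as Logit;
# field names and the example audit event are illustrative only.
import json
import logging

class JsonFormatter(logging.Formatter):
    """Render each log record as a single JSON line, easy for log shippers to parse."""
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
            "user_id": getattr(record, "user_id", None),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("monitor_space_hazards")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# Example audit event: user activity logged so it can later be analysed for
# potential bad-actor behaviour (hypothetical field names).
logger.info("conjunction report viewed", extra={"user_id": "operator-123"})
```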

Monitor Space Hazards also features a feedback page, linked from the Beta banner. Users are asked, “Overall, how did you feel about the service you received today?” and are given the option to provide more details.
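
A minimal sketch of the data such a feedback page might capture is shown below; the field names and satisfaction scale are assumptions rather than the service's actual schema.

```python
# Illustrative sketch of a feedback record; the fields and 1-5 scale are assumptions.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class FeedbackResponse:
    """One answer to 'Overall, how did you feel about the service you received today?'."""
    satisfaction: int        # e.g. 1 (very dissatisfied) to 5 (very satisfied)
    details: Optional[str]   # the optional free-text follow-up
    submitted_at: datetime

feedback = FeedbackResponse(
    satisfaction=5,
    details="Notifications arrived quickly and were easy to understand.",
    submitted_at=datetime.now(timezone.utc),
)
```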

Central government services must also publish data on the mandatory key performance indicators (KPIs).

  • Note: this service is not a transactional service, so we cannot report on the mandatory KPIs requested by GDS. However, we will assess similar, comparable KPIs, and data for these KPIs will be published at public beta.
This page was last reviewed on 7 December 2022. It needs to be reviewed again on 7 June 2023.