Applying the Service Standard
Version 1.0
Guidance on how to apply the Service Standard when delivering a service
1. Introduction
This guide explains how to apply the Service Standard during the development of a government digital service. It’s designed to be simple, practical, and focused, so teams can easily find what they need at each phase and move forward.
The guide follows the core phases of service development:
- Discovery
- Discovery & Design
- Design
- Build/Develop
Each phase includes:
- What to do
- Responsibilities (with clear task descriptions)
- Deliverables for the phase
Throughout this guide, you’ll see responsibilities split across three roles:
- Service Provider (SP) – the team building and designing the digital service
- Service Owner (SO) – the team responsible for the service’s purpose, scope, and approval
- Contracting Authority (CA) – the organisation funding or commissioning the service, often managing timelines and compliance. Usually, a Project Manager from the Department of Information Technology Services.
Teams should use the Digital Toolbox throughout the process. It includes tools, templates, and resources that help accelerate development and keep services aligned with DSF standards.
Everything in the Toolbox is built to support services that are secure, accessible, consistent, and easy for people to use.
2. Discovery Phase
2.1 Understand User Needs
Goal: Understand the problem to solve and who you’re solving it for.
2.2 What to do
- Create basic user profiles – SP, SO
- Plan and organise user research sessions, including recruiting participants – SP, SO
- Collect and evaluate existing user feedback (e.g. complaints, calls, emails) – SP
- Interview or observe real users – SP
- Map the current user journey and identify pain points – SP
- Define the main user needs and write a problem statement – SP
- Write a clear service goal – SP
2.3 Responsibilities
2.3.1 Service Provider (SP)
| Task | Description |
|---|---|
| Create user profiles | Summarise who your eligible users are and what they need to do. |
| Interview or observe real users | Run interviews, surveys, or observations to understand their needs. |
| Collect and evaluate existing feedback | Use complaints, phone calls, or other data the Service Owner (SO) already has. |
| Map the current journey | Show how users go through the service today and where they get stuck. Also speak to internal business users to hear their feedback and pain points. |
| Define user needs and problem statement | Write what users need, what’s not working, and the core issue. For example, “Citizens who change their address must contact multiple government departments separately. This leads to delays, confusion, and inconsistent records. There is no single, user-friendly way to update an address across services.” |
| Write the service goal | Describe the service's purpose and what it is trying to achieve. |
2.3.2 Service Owner (SO)
| Task | Description |
|---|---|
| Create user profiles | Work with the Service Provider to identify who the eligible users are and what they need to do. |
| Provide user feedback sources | Share complaints, support calls, and other input that can help shape research |
| Recruit participants for user research | Where needed, arrange participants for user research (minimum 5 per session). Sessions may take place during: Discovery (may be skipped if strong feedback already exists from other sources), Design (prototype), Staging testing, and the Pilot phase |
| Observe user research sessions | Join the research sessions to hear directly from citizens. This helps you better understand their needs, pain points, and how they interact with the service. |
| Define scope and goal | Set boundaries and clarify what the service is meant to deliver |
| Approve user needs and problem statement | Validate findings and confirm alignment with policy and department priorities |
| Align internal stakeholders | Bring in relevant teams and make sure they support the direction |
2.3.3 Contracting Authority (CA)
| Task | Description |
|---|---|
| Support access to users and stakeholders | Help the team reach people and departments they need to speak to |
| Secure resources | Confirm the budget, people, and time needed to support the work |
| Monitor progress | Track timelines and flag issues early |
2.4 Deliverables for this phase
- Summary of user research and key findings
- Problem statement and service goal
- User journey map
3. Discovery & Design (Bridge) Phase
This phase bridges the gap between understanding the problem and starting the design work. It helps turn user insights and service goals into simplified, practical workflows that can be designed and built.
It focuses on removing unnecessary complexity, working with stakeholders to improve processes, and making sure the service is aligned with user needs before the detailed design begins.
3.1 Simplify and Redesign Process
Goal: Improve existing workflows before starting detailed design work.
3.2 What to do
- Map the current process from the user’s point of view – SP
- Identify pain points and areas of confusion – SP
- Co-design improved workflows with end users and stakeholders – SP, SO
- Simplify steps and remove what’s unnecessary – SP
- Document the redesigned process with clear diagrams – SP
- Get approval from the Service Owner (SO) – SO
3.3 Responsibilities
3.3.1 Service Provider (SP)
| Task | Description |
|---|---|
| Map the current process | Map the steps users take today. Use simple flow diagrams to highlight key actions. |
| Identify pain points | Highlight steps that cause delays, confusion, or frustration |
| Run co-design sessions | Involve users and stakeholders to shape a better process |
| Simplify the process | Remove steps or complexity that don’t add value. Apply the once-only principle. Don’t ask users to provide information that government systems already have. |
| Document the redesigned workflow | Create updated process maps or diagrams |
3.3.2 Service Owner (SO)
| Task | Description |
|---|---|
| Attend co-design sessions | Join the workshops to help improve existing processes. Your input helps align service changes with policy, business rules, and user needs. |
| Ensure alignment with service goals | Make sure the new process matches the service goal |
| Approve the redesigned process | Sign off before moving forward |
| Confirm internal team readiness | Make sure teams understand and can deliver the new improved workflow |
3.3.3 Contracting Authority (CA)
| Task | Description |
|---|---|
| Coordinate co-design and simplification efforts | Coordinate involvement of key people |
| Provide access to systems and data | Ensure the team has what it needs to understand and simplify processes |
| Support communication and compliance | Help communicate service changes and ensure they meet legal and policy requirements. |
3.4 Deliverables for this phase
- Redesigned process and improved workflow diagram
- Summary of how the process was improved
- Any necessary policy approvals
4. Design Phase
4.1 Prototype and Test
Goal: Explore and test different solutions before building the final service.
4.2 What to do
- Identify and map service flows and possible variations – SP
- Write test scenarios for each journey – SP, SO
- Provide a service domain name – SP, SO, CA
- Fill in and submit the Service Delivery Initiation Form – CA
- Build wireframes or clickable prototypes, preferably using the DSF Figma library – SP
- Test prototypes with real users – SP
- Improve designs based on user feedback – SP
- Document decisions and prioritise features – SP, SO
- Make sure all screens follow the Unified Design System – SP
- Submit the Service Assessment Request Form to schedule a Preliminary Assessment – CA
4.3 Responsibilities
4.3.1 Service Provider (SP)
| Task | Description |
|---|---|
| Map service flows and write test scenarios | Map how different users move through the service (flows) and create test scenarios to cover different ways users may use the service. |
| Build wireframes or clickable prototypes | Create screens using the DSF Figma library (Figma DSF – gov.cy Unified Design System – v.3) |
| Test with users and gather feedback | Run usability tests with real or representative users using wireframes or clickable prototypes. Observe what works and what doesn’t, including whether the content is clear and understood, then update the design. |
| Iterate and improve the design | Use feedback to make the design clearer and easier |
| Provide a service domain name | In collaboration with the Service Owner (SO) and Contracting Authority (CA), provide the web address (for example, [service_name].service.gov.cy) that users will visit to access the digital service online. |
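Test scenarios can be captured in a lightweight given/when/then format so that the Service Owner can review coverage without reading technical detail. A hypothetical sketch, reusing the address-change example from the Discovery phase (the feature name, steps, and wording are illustrative, not prescribed by the Service Standard):

```gherkin
# Hypothetical test scenarios for an address-change service (illustrative only)
Feature: Update a citizen's address

  Scenario: Citizen updates their address online
    Given the citizen has signed in with CyLogin
    When they enter and confirm their new address
    Then the address is updated once for all connected services
    And a confirmation message is displayed

  Scenario: Citizen submits an incomplete address
    Given the citizen has signed in with CyLogin
    When they submit an address with a missing postcode
    Then an error message explains what needs to be corrected
```

Writing one scenario per journey variation makes it easier to check, during the Design phase, that every mapped service flow has a matching test.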
4.3.2 Service Owner (SO)
| Task | Description |
|---|---|
| Approve design direction and prototypes | Sign off on what will be built |
| Approve test scenarios | Make sure all user journeys are covered |
| Prioritise features | Choose what to build first based on user needs |
| Provide a service domain name | In collaboration with the Service Provider (SP) and Contracting Authority (CA), provide the web address (for example, [service_name].service.gov.cy) that users will visit to access the digital service online. |
4.3.3 Contracting Authority (CA)
| Task | Description |
|---|---|
| Complete the Service Delivery Initiation Form | Fill in and submit the form that initiates service delivery |
| Submit the Service Assessment Request Form to schedule a Preliminary Assessment | Send the completed form to DSF (email dsf-qa@dits.dmrid.gov.cy) to initiate the Preliminary Assessment process |
| Support testing logistics | Help provide or arrange test participants |
| Join sessions or give input if needed | Keep track of design progress and usability testing. Participate or provide input if needed. |
| Check readiness for the Preliminary assessment | Make sure all deliverables are complete |
| Provide a service domain name | In collaboration with the Service Owner (SO) and Service Provider (SP), provide the web address (for example, [service_name].service.gov.cy) that users will visit to access the digital service online. |
4.4 Deliverables for this phase
- Test scenarios and service flows
- User-tested prototypes (wireframes or clickable demos)
- Designs that follow the Unified Design System
- Documented feedback and design changes
- Approved list of prioritised features
- Tools used (e.g. Figma)
- Updated version of the process map (if changed)
- Service domain name confirmed
- Service Delivery Initiation Form
5. Build/Develop Phase
5.1 Build and Test the Service
Goal: Develop the full version of the service: the complete, functional build that can be introduced to real users with real data during the Pilot Phase. Test this version thoroughly to ensure all features, integrations, and data handling work as intended. Once testing is complete, prepare for the Final Assessment and Pilot launch.
5.2 What to do
- Build the service using components from the Unified Design System – SP
- Follow the DSF Onboarding and Deployment Guide – SP, CA
- Submit Service Architecture Document – CA
- Staging Deployment (stage 1 of DSF Onboarding and Deployment Guide) – SP
- Contact the CDS team (cds-support@dits.dmrid.gov.cy) to:
- Request CyLogin, CyConnect, CyNotify, CyPay (if needed)
- Exchange technical configs (API keys, callback URLs, etc.)
- Set up the code repository and staging deployment (per the DSF Onboarding and Deployment Guide):
- Accept the GitHub repo invitation from DSF.
- Clone the repo and push your code to the main branch.
- Create a Dockerfile in the repo root.
- Prepare staging secrets file (e.g. .env, appsettings.json)
- Include Matomo IDs, CDS credentials, etc.
- Do not commit this to GitHub.
- Send it securely to DSF.
- Notify DSF that Steps 1-5 (stage 1 of DSF Onboarding and Deployment Guide) are done, at least 3-5 days before staging deploy.
- Wait for DSF to deploy to staging and confirm the app is live.
- Set up GitHub Actions for automated staging deployments.
- Push code updates to main for re-deployment to staging.
- Testing on the Staging environment
- Identify and create test users – SP, SO
- Run Functional Testing on the service in staging using test data – SP, SO
- Run Non-Functional Testing on the service in staging – SP
- Performance Testing (by tracking key performance indicators)
- Load and Stress Testing
- Security Testing (Penetration Testing)
- Accessibility Testing
- Conduct User Acceptance Testing (UAT) – SO
- Run user research on staging and apply improvements – SP
- Submit the Data Protection Impact Assessment (DPIA) – SO
- Define and implement KPIs and analytics – SP, SO
- Hold a Show & Tell session with stakeholders – SP
- Submit the Service Assessment Request Form to schedule the Final Assessment session – CA
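The Dockerfile required in the repository root (step above) can stay very small. A minimal sketch, assuming a Node.js service listening on port 3000; the base image, port, and start command are assumptions, so substitute whatever matches your stack:

```dockerfile
# Minimal example Dockerfile (assumes a Node.js service on port 3000)
FROM node:20-alpine
WORKDIR /app

# Install dependencies first so Docker can cache this layer
COPY package*.json ./
RUN npm ci --omit=dev

# Copy the application code and declare the service port
COPY . .
EXPOSE 3000

# Secrets (e.g. .env values) are injected at deploy time, never baked into the image
CMD ["npm", "start"]
```

Keeping the staging secrets file out of the image and out of GitHub, as the steps above require, means the same image can be promoted between environments with only its configuration changing.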
5.3 Responsibilities
5.3.1 Service Provider (SP)
| Task | Description |
|---|---|
| Build using DSF Unified Design System | Use the Unified Design System and reusable components to ensure consistency, quick development and accessibility |
| Follow the DSF Onboarding and Deployment Guide | Complete the guide's steps for staging deployment, from repository setup through to the staging deploy |
| Identify and create test users | Set up user accounts on CY Login test environment, and roles for all types of tests |
| Write and run functional, performance, and accessibility tests | Cover all key journeys with automated or manual testing to check quality and compliance. Simulate real user behaviour using non-production data and run full test cycles. Record all issues found and fix high-impact ones before moving to user research. |
| Run Non-Functional Testing on the service | Run Non-Functional Testing on the service in the staging environment. These include: – Performance Testing (by tracking key performance indicators) – Load and Stress Testing – Security Testing (Penetration Testing) – Accessibility Testing |
| Define and implement KPIs and analytics | Define and implement performance tracking based on service goals |
| Run user research on staging | Observe users interacting with the service in a near-live environment; document findings, fix any issues, and update the service before release |
| Support UAT | Coordinate with teams to define roles and organise testing |
| Hold a Show & Tell session with stakeholders | Demo progress and gather feedback from involved teams and decision-makers |
| Prepare Final Assessment documentation | Gather all test results, feedback, and required documentation |
5.3.2 Service Owner (SO)
| Task | Description |
|---|---|
| Confirm the service domain name | Collaborate with the Service Provider (SP) and Contracting Authority (CA) to confirm domain |
| Identify and create test users | Support Service Provider (SP) in preparing realistic test accounts |
| Run Functional Testing on the service in staging using test data | Participate in testing service behaviour |
| Conduct User Acceptance Testing (UAT) | Review and test the service from a user and business perspective to confirm it meets the agreed requirements |
| Submit the Data Protection Impact Assessment (DPIA) | Send the completed DPIA to the Commissioner. If it’s not needed, get written approval from the Commissioner to skip it. |
| Define KPIs and analytics | Define performance indicators with the Service Provider (SP) |
| Approve final changes | Sign off before requesting assessment |
| Attend Final Assessment session | Be present during the service assessment |
5.3.3 Contracting Authority (CA)
| Task | Description |
|---|---|
| Monitor delivery | Track progress and timelines |
| Help with integrations and access | Resolve technical dependencies |
| Follow the DSF Onboarding and Deployment Guide | Support technical setup and coordination |
| Submit Service Architecture Document | Provide technical overview and service context to the DSF Tech team by email: dsf-tech@dits.dmrid.gov.cy |
| Coordinate privacy and compliance reviews | Support DPIA and other checks |
| Obtain CY Login / Notification / Payment accounts | Coordinate with the CDS team at DITS via email to: cds-support@dits.dmrid.gov.cy to secure credentials. |
| Confirm the service domain name | Facilitate domain setup with the Service Provider (SP) and Service Owner (SO) |
| Submit the Service Assessment Request Form to schedule an assessment session | Send the completed form to DSF (email dsf-qa@dits.dmrid.gov.cy) to initiate the Final Assessment process |
| Check readiness for final assessment | Make sure all deliverables are complete |
5.4 Deliverables for this phase
- Working version of the service in staging
- Service Architecture Document (from the onboarding step)
- Service built using components from the Unified Design System
- Codebase uploaded to GitHub (with README and CHANGELOG files)
- Service secrets submitted to DSF Tech
- Service domain name confirmed and active
- API documentation, including the uniform resource identifiers (URIs)
- Matomo analytics configured and tracking key journeys
- Dockerised service containers
- Functional testing report (List of test scenarios and test users)
- Non-functional testing report, including:
- Performance testing
- Load and stress testing
- Security testing (e.g. penetration test results with no Medium or High issues)
- Accessibility testing
- Show & Tell session summary (optional but useful)
- User Acceptance Testing (UAT) summary by the Service Owner (SO)
- User research insights from staging
- Improvements made based on feedback
- Approved Data Protection Impact Assessment (DPIA)
- Defined KPIs and analytics setup
- Final approval from the Service Owner (SO)
- Service Assessment Request Form submitted
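The "GitHub Actions for automated staging deployments" step above can be sketched as a small workflow file. This is a hypothetical example: the workflow name, image tag, and final deploy step are assumptions, since the actual deployment mechanism is whatever DSF configures for your repository:

```yaml
# Hypothetical GitHub Actions workflow (.github/workflows/staging.yml)
# Pushing to main rebuilds the Docker image for staging re-deployment.
name: staging-deploy

on:
  push:
    branches: [main]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Build the image from the Dockerfile in the repository root
      - name: Build Docker image
        run: docker build -t my-service:staging .

      # Placeholder: the real deploy step is defined per the
      # DSF Onboarding and Deployment Guide for your repository.
      - name: Deploy to staging
        run: echo "Deployment handled per the DSF Onboarding and Deployment Guide"
```

Because the workflow triggers on pushes to main, pushing code updates (as described in the deployment steps) automatically re-deploys the service to staging.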
6. Final Assessment: What you need to prepare
The Final Assessment is your last checkpoint before the service is introduced to the public. The DSF Assessment Panel will review the full working service, assess it against all 15 principles of the Service Standard, and verify that it’s ready for pilot release.
To proceed, you must show that your service:
- Was built using DSF tools and design standards
- Meets user needs (with supporting research and feedback)
- Is secure, accessible, and functional
- Has documented how it will be monitored, maintained, and iterated
The summary below lists the key Final Assessment Checklist items your team must have ready:
| What to Prepare | Why it’s Needed |
|---|---|
| Working version of the service in staging | To demo real user journeys from start to end |
| Test reports (functional, non-functional, accessibility, Penetration, WASA, etc.) | To prove that the service has been tested thoroughly and that issues have been resolved |
| Data Protection Impact Assessment (DPIA) | To confirm compliance with privacy and data regulations |
| User research findings | To show the service meets user needs, supported by research and feedback |
| Designs consistent with the Unified Design System | To confirm accessibility, consistency, and Government brand alignment |
| KPI setup and performance metrics | To demonstrate how success will be measured |
| Technical documentation (Service Architecture Document, README, CHANGELOG) | To confirm the service is maintainable and deployment-ready |
| Final approval from the Service Owner (SO) | To verify that the service is officially cleared for public use |
Once the service is approved to move into the Pilot Phase (controlled release), it can be deployed to the Production Environment. During this phase, selected real users will test the service using real data for a period agreed upon during the Final Assessment. If the pilot is successful, the service will receive the Verified Seal, and the final version will be published and officially announced on Gov.cy.