🍊
Orange Juice - ING Unified Design System

Product Design, Design Systems, Design Operations
Program Overview
The need for a Unified Design System at ING was identified as early as 2018, as an enabler for converging 14 different country-specific user experiences. The program supports four technology platforms: Android, iOS, Web, and Hybrid. The design system is built to speed up delivery and to allow ING to respond quickly to customer demand and market changes.

The initial KPIs of the program were to:

1. Deliver 60+ components across the supported platforms;
2. Create a portal to house design and developer foundations and components;
3. Evaluate the needs of consumers from the 14 countries
Initiative: Evaluating Information Architecture
The Orange Juice portal was derived from a developer-facing component library site; it started as a copy of the ING Web and Lion component documentation websites.

Organizing the Research Operations

I organized and launched ING's first cross-border user research study. Its goal was to challenge, validate, and recalibrate the information architecture in place at the time.

The design team lacked expertise and skills in user research, so I leveraged our global governance structure: partnering with the research teams of ING Spain and ING Netherlands+Belgium, we built a tree test and usability test plan in UserZoom.

You can visit Lion here: https://lion-web.netlify.app/
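
For context on why the portal inherited a developer-centric structure: Lion is ING's open-source, white-label component base, designed to be extended into branded sets such as ING Web. Below is a minimal, illustrative sketch of that extension pattern. The `@lion/core` and `@lion/button` entry points reflect an older Lion release, and `ing-button` with its styling is hypothetical, not ING's actual implementation.

```ts
// Illustrative only: extending Lion's white-label button into a branded
// component, the pattern ING Web follows on top of Lion. Package entry
// points assume an older Lion release; "ing-button" is hypothetical.
import { css } from '@lion/core';
import { LionButton } from '@lion/button';

class IngButton extends LionButton {
  static get styles() {
    return [
      super.styles, // keep Lion's functional (white-label) styles
      css`
        :host {
          background: #ff6200; /* orange, purely for illustration */
          border-radius: 4px;
        }
      `,
    ];
  }
}

// Register the branded component under its own tag name.
customElements.define('ing-button', IngButton);
```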
Adopting the User Research Process in the Unified Design System (UDS)
Side story for the initiative
Since UDS was still fresh out of the oven, the team had no capability, skill, or maturity in leveraging user research. Finding the right sponsor for the initiative was not a challenge; the hurdle was the team's resistance to running the analysis, since it seemed like a criticism of their initial work. Finding external allies who are experts in the field helped ease the team into the discovery process. I also had to seek every country's support to finally add the study to the backlog and get it running.
Once the initiative was added to the sprint, we had to scope which version of the portal to evaluate. The team created a separate git branch to preserve the version under evaluation.

How the test was organized:

1. Defined the problem & scope
Aimed to validate the information architecture of the online portal

2. Chose artifacts and techniques
Combined quantitative and qualitative research through tree testing

3. Defined the sample (size & profile)
Invited 100+ participants, including engineers and designers from different ING countries

4. Launched (field phase)
Ran a dry run with stakeholders
Launched the tree test

5. Analysed results & defined next steps
Ran a co-analysis with the Orange Juice team
Added the resulting design iterations to the sprint backlog

Tree Test Questionnaire

The study is short: the first 3 questions profile the participants, and the remaining 8 are tree-test tasks mixing qualitative and quantitative elements. We also added a fake task to filter out participants who were not engaged in the study.
Each task is followed by a rating of how easy it was to complete. If the task was completed easily, the participant explains why they believe their answer is correct; if it was difficult, they explain the context of the difficulty. (A simplified model of these tasks is sketched after the notes below.)

Tasks 4 to 11 focus entirely on how information is structured across the following pages of the portal:
1. Getting Started page
2. Typography page
3. Sketch Elements - how to use the components
4. Component page (side navigation)
5. Design Tokens page
6. Colors (Foundations) page

Important notes:
- Participants had no time limit
- The sample had to be large enough for the results to be statistically valid
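
To make the task structure concrete, here is a small, purely illustrative data model of a tree-test task and its ease-of-completion follow-up. All type and field names are hypothetical; in practice, UserZoom configures tasks and follow-up questions through its own interface.

```ts
// Purely illustrative model of one tree-test task and its follow-up;
// names are hypothetical (UserZoom configures this through its UI).

type Ease = 'easy' | 'difficult';

interface TreeTestTask {
  id: number;
  prompt: string;        // the scenario shown to the participant
  correctPath: string[]; // expected route through the portal's IA tree
  isDecoy?: boolean;     // marks the fake task used to filter out
                         // participants who are not engaged
}

interface TaskResponse {
  taskId: number;
  chosenPath: string[];  // where the participant actually navigated
  ease: Ease;            // self-reported ease of completion
  followUpAnswer: string;
}

// The follow-up prompt depends on the reported ease: easy completions
// ask why the answer felt correct, difficult ones ask for context.
function followUpPrompt(ease: Ease): string {
  return ease === 'easy'
    ? 'Why do you think the location you chose is correct?'
    : 'What made this task difficult?';
}

// Example: a hypothetical task targeting the Design Tokens page.
const task: TreeTestTask = {
  id: 8,
  prompt: 'You want to reuse the color values in code. Where would you look?',
  correctPath: ['Foundations', 'Design Tokens'],
};

console.log(followUpPrompt('difficult')); // "What made this task difficult?"
```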

Socializing the results within UDS

Communicating the results to the team is essential for the study to reach its maximum impact: by getting the team involved in the initiative, we could easily turn the insights from the study into backlog items. We call this session a "co-analysis" with the stakeholders.

The co-analysis was time-boxed to 2 hours: the first half walked the team through the results; the second half captured their understanding of the results and added it to the backlog.

In the pre-analysis, the results for each question were clustered and categorized with a brief summary. During the session, each person on the team had their own airtime to get their thoughts heard.