Blue Shield of California wanted to improve the Find a Doctor experience to make it easier for users to find the right doctor.
I led a special project team that redesigned the doctor search experience, which serves approximately 1.3M unique users per year.
My role was UX Design Lead, managing the work of a UX Designer and a Visual Designer. We also partnered with a UX Researcher and several Product Owners over a timeframe of just under two months.
Challenge
The Find a Doctor (FAD) tool is the single most trafficked area of the Blue Shield of California website, with upwards of 1.3M unique users per year. As such, it needed to better deliver on its promise of allowing users to successfully complete their search tasks.
The project consisted of upgrading the FAD tool with required feature enhancements for compliance reasons and to improve the overall experience. Blue Shield Digital Product Owners were tasked with increasing aggregated tool adoption through support for expanded use cases, specifically for members new to Blue Shield.
A Plan to Get There
The project plan was to address functional, back-end, and user interface improvements based on usage data, user feedback, and the overall level of feature maturity. Given the mandate to move quickly and secure a "quick win," we posited that the information and functionality displayed on the Provider Profile page was the best place to focus our collective efforts.
Evaluating the Pieces
After much discussion and debate, we agreed on an approach that facilitated iteration and allowed for feedback, corrections and enhancements, with the objective of minimizing wasted effort and rework.
As the FAD tool is large, complex and supports numerous use cases, defining initial scope for this project was critical. We needed to determine what we wanted (and would be able) to address given the technical and scheduling constraints of the project. All documented improvements were evaluated, vetted and prioritized resulting in a smaller, workable scope for the project. It was confirmed that initial project scope would be limited to the Provider Profile page.
The Framework
Users want to accomplish a task
At this point we paused to consider whether we might benefit from the Jobs-to-Be-Done framework, applying its behavioral lens to a simple question:
Q: What job(s) is a person hiring the FAD tool to do?
A: To locate a doctor and secure an appointment, ensuring a positive health care outcome.
A: To select a doctor as my Primary Care Provider (PCP) and partner in my care, ensuring a positive health care outcome.
Describing the problem through the lens of this framework allowed us to both distill and better clarify how users interact with this tool. This insight served as a “North Star” in terms of guiding our decision making and general design direction.
Research
Discover
As a collective group we all brought our opinions to the project, yet we eagerly wanted to challenge our assumptions and remained committed to being data‑driven in our decision making.
We knew research held the key to unlocking these insights. For the research portion of the project, it was decided we would conduct research for the overall FAD tool initially, and then focus our efforts on the Provider Profile area of the tool.
Leadership Highlight
There was discussion regarding how comprehensive the research should be. I challenged everyone to think more holistically, and subsequently guided the team to expand the scope of the research. It was critical to our work to understand how users interact with the entire Find a Doctor tool, not just a portion of it.
A Logical First Step
As part of discovery we needed to establish a baseline, so we audited the current state of the tool. The standard user flow is a three-page process: on the initial screen, search parameters are configured by selecting a provider search type along with one or more specialties.
The second screen displays the search results, and the final screen shows the specific provider's profile information. The current layout was disjointed, out of compliance with mandates, and presented information in a confusing manner. Notably, the baseline audit revealed that data accuracy was a major concern, but it was deemed out of scope for this initiative.
An Important Balance
We next looked at better understanding Business Objectives and User Needs and how we might achieve balance between them. We remained mindful of the following considerations:
User considerations
To be able to understand and easily find relevant provider data
To be able to easily locate timely and accurate provider data
To be able to achieve overall successful task completion for relevant use cases
Business considerations
To remain within mandated compliance
To be able to satisfy product performance objectives
To be able to meet scheduling objectives
To fall within budgetary constraints
Technical considerations
To remain within the bounds of technical feasibility
To avoid violating security protocols
Listening to Users
We embarked on several rounds of user research to better understand overall attitudes, user knowledge, behavioral habits and expectations both in a general sense regarding health insurance and specifically regarding the FAD tool.
Each progressive phase of research built upon the previous findings as we sought to gain deeper insights and understanding of user needs and desires.
We were seeking insights in the following areas:
What, in the current design, resonated with users (and what confused them)?
What data elements should be displayed?
What terminology and syntax should be used?
How should the information be structured?
Design–Test–Build
Define
In conjunction with user research we sought to incorporate data and analytics as well. We summarized user feedback and synthesized it along with user data and analytics.
While this was a time-consuming step, it was crucial for understanding which data elements we currently displayed and which new data elements users (or mandates) indicated were needed.
We worked closely with Product Owners and our IT partners to gain clarity around what data elements already existed in our service architecture and what new data elements would need to be captured and ultimately displayed.
Others Are Trying to Solve this Same Problem
We looked at some of our competitors in Health Insurance including Aetna, United Health Care, Kaiser, Oscar and others. In collaboration with the entire project team we began to assess the concepts and ideas that inspired us and determined which approach we wanted to pursue further.
During this phase we were careful to ensure that the direction we were heading was compatible with the technical limitations in place while remaining faithful to the user insights that had been revealed.
Design–Test–Build
Design
After significant time spent in the Research phase, we moved into Design. Solutions were explored iteratively and, once complete, more detailed user feedback was sought via user testing.
Sketches were created, refined, and presented so we could begin selecting the most promising approach while simultaneously validating the overall design concept.
Following the Users
New and existing user flows were investigated and documented. We worked through the flows that were in scope for this portion of the project and noted those flows that were outside of the established scope which would be addressed in a future project.
Design Sketching
Sketching commenced, and after further synthesis we settled on two approaches, both of which satisfied the established design criteria but were distinct enough to give users two genuinely different options. We presented:
A centered layout on a “floating” panel.
A left-justified layout with borderless “white-space”.
Low‑Fidelity Wireframes
Sketches were converted to low-fidelity wireframes in preparation for user testing sessions. For testing purposes, specific data details were deliberately omitted so we could assess layout and page structure alone; these details would be inserted, refined, and tested later as part of an iterative approach.
User Testing – Iteration 01
To test the layouts, several in-person testing sessions were performed. Participants completed a questionnaire, and we captured extemporaneous feedback and impressions on the layout and structure of both design options. Users preferred the centered layout but liked the left-justified text treatment. Page length and scrolling posed no issue. Displaying a photo of the doctor was requested by nearly all test participants.
Specify
Deliver
Insights revealed from initial rounds of design and testing allowed us to continue to move the solution forward.
We refined the designs and created a second round of iterations as we prepared for another session of user testing.
High‑Fidelity Wireframes
After the initial rounds of testing were completed using low‑fidelity wireframes, we created high‑fidelity wireframes in preparation for the next round of testing. In these subsequent rounds, specific data details were inserted and tested as part of an iterative approach.
User Testing – Iteration 02
To test the layouts, further in-person testing sessions were performed. Participants completed an additional questionnaire, and we captured their feedback and impressions on the centered layout version of the page. In addition to a doctor photo, Ratings and Reviews was a frequently requested feature.
Final Design
At the conclusion of the user testing rounds, we synthesized the user feedback along with the data and arrived at the final design layout. As the next step, visual design was applied. A newly revamped design system was implemented, with minor modifications, to ensure site consistency while allowing backward compatibility with the existing FAD pages that were out of scope for this project.
This is the proposed redesign of the Provider Profile page.
Leadership Highlight
With project deliverables complete, team members were disappointed that certain features and improvements were not included. I counseled the team on the incremental nature of our work and our role in continuous improvement, and assured them that the enhancements we made were significant and would help our customers achieve their objectives.
Specifications
Once the final design had been tested and approved, the next step was the creation of detailed specification documents for use by the Development team.
We were able to leverage design software to assist in the creation of the annotated mock-ups that would be used by our IT partners for development and coding purposes.
Impact
Noticeable improvements (and more to do)
While the redesigned Provider Profile page has yet to be deployed, there were many lessons and observations. We expect to collect usage data as well as quality survey feedback once the page goes live.
With a focus on being data-driven, we were able to meet the following design objectives for the Provider Profile page, as documented through the research and data analysis steps:
More open white-space
More prominent positioning of summary information in the header portion of the page
Better delineation of sections with clearly defined section titles
Improved presentation of ratings and reviews
Display of the doctor's photograph
Improved map interactions
Improved visibility of "Accepting new patients" status, in fewer steps
Compliance with mandates where required
On-schedule delivery
Learning
We challenged our assumptions and there is more to do
While we were pleased with the progress we made given the scheduling and technical restrictions we encountered, we remain hopeful that we can continue to achieve more in subsequent phases of the project.
The project was budgeted and scoped for only a portion of the overall FAD tool, but we made the decision to conduct our research more comprehensively for the entire tool. In hindsight, allocating so much of the schedule to these tasks was not worthwhile. The time devoted to holistic research devoured most of the working schedule and many of the insights revealed were not actionable as they were out of scope for the approved portion of the project. Ideally, at the outset, we should have advocated more actively to expand scope and schedule to be at the overall holistic tool level, and not accepted a piecemeal approach to product enhancements.
The technical issues associated with the data elements caught us by surprise. We eventually realized this was not a design issue, and design alone could not solve it. Data integrity and availability became a challenging topic, as there was concern that any discussion could be misconstrued as criticism of the IT group's competency. The dynamics surrounding the issue underscored for me the value of not only working toward deep collaboration with our partners, but doing so with trust.