Comparative Analysis of Accessible and Organized Content Designs
Problem
Inconsistent mobile experiences across tools confused users and kept them from making use of similar features. The sheer number of interface inconsistencies made effective engagement difficult.
Solution
After thorough research and heuristic evaluation, we concluded that targeting specific mobile components would enhance design, development, and user experience.
Outcome
Although the solutions have not been rolled out yet, our internal team has responded positively to them and to how easily the fixes can be implemented, eliminating the need for tool-specific adjustments.
Overview
Summary
Each week, we are assigned the task of reviewing the HPE domain to identify areas that could be improved. We closely analyze the analytics to gauge how these sections perform. When we come across underperforming experiences, our next step is to experiment with various designs and layouts, aiming to enhance user engagement and make it easy for users to find the information they seek.
Several HPE pages have clickable boxes that open a pop-up carousel modal. I had always assumed the carousel contained only information pertaining to the box I clicked, but it does not. I believe this confuses users and leaves them unsure where to go to find what they are looking for.
My idea is that if I clearly separate all the information and keep the content condensed, users will engage more and find what they are looking for. For this test, I decided to run an A/B test with two new variants alongside the default, to see whether users prefer having all the information laid out at once or having it condensed and organized in a tabbed view. All breakpoints will be included.
We noticed an odd interaction on the page: the main content was hidden behind content blocks that, when pressed, opened a carousel of six items.
Initially, it was assumed that each category would have its own distinct carousel of information relevant to the selected category. However, upon exploring and clicking through all the different categories, it became apparent that the carousel information remained consistent across all categories.
Upon delving into the analytics, it became evident that users were clicking on the categories; however, they were not actively scrolling through the content carousel.
It appears that users encountered this issue because, despite clicking on specific categories, they were still required to scroll through the carousel in order to access the content that directly related to their selection.
The information and videos were concealed behind multiple layers of clicks, making them more challenging for users to access.
This discrepancy poses a significant challenge for users in their quest to find the information they seek, as the carousel content fails to align with their intended selections.
The image on the left displays the original layout featuring four category boxes, accompanied by the interaction rank denoted in the top right corner of each box.
On the other hand, the image on the right showcases the carousel that appears when any of the category boxes are clicked, with the initial click consistently revealing the first carousel content.
Testing Scope
Prior to initiating any testing or new project, it is imperative to establish a clear testing scope. This ensures that we have a precise understanding of what aspects we intend to test and the specific actions we will undertake. Failing to define the testing scope may lead to distorted results, as the objectives and parameters are not clearly outlined. Moreover, having a defined scope aids in the design phase by preventing unnecessary over-designing of elements that are not required.
Hypothesis
The current layout of the overview confuses users: they click one content box expecting information for only that topic, but instead get a carousel containing all the other content as well. Organizing this information more clearly should improve click-through rates.
Measurement
This will be an A/B preference test with three variants (A/B/C) where we will measure clicks on components and respective CTAs.
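To make the measurement concrete, variant assignment and click tracking for a test like this can be sketched with deterministic hashing, so each visitor always sees the same variant. This is an illustrative sketch only; the variant names and the `log_click` helper are hypothetical, not our testing platform's actual API.

```python
import hashlib

# Hypothetical variant names for the A/B/C preference test.
VARIANTS = ["default", "tabbed", "full_content"]

def assign_variant(user_id: str) -> str:
    """Hash the user ID so each visitor is consistently bucketed into one variant."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest, 16) % len(VARIANTS)
    return VARIANTS[bucket]

def log_click(user_id: str, component: str, clicks: dict) -> None:
    """Record a click on a component (a tab or CTA) under the user's variant."""
    variant = assign_variant(user_id)
    clicks.setdefault(variant, {}).setdefault(component, 0)
    clicks[variant][component] += 1
```

Deterministic bucketing avoids a visitor flipping between variants across sessions, which would contaminate the per-variant click counts.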
Action after testing
Test on another page. Share results with UX to consider template update.
User story
As a personalization strategist, I want to test the presentation of information, so that we can see if one design encourages engagement more than the others.
Journey experience
As a visitor to the Machine Learning Ops page, I will see one of three experiences for the information presented under the section 'A CONTAINED-BASED SOLUTION FOR THE ML LIFECYCLE.' One will be the default, one will be a tabbed approach, and one will be displaying all content without the need for any additional clicking.
Variant one & two mockups
Variant one
This first set of images shows the mockups and breakpoints of the tabbed, condensed approach to the content. I went with this look because the UX team had created a new design system with these new modules. I had not seen them anywhere on the site before and thought this would be a great way to organize information.
Variant two
This second set of images shows the mockups and breakpoints of the content fully laid out, with no additional clicks. I added this variant to the test because other tests in which everything was laid out, with no information hidden, had produced good results. This is the new style of the HPE GreenLake page, and I wanted to test whether this layout actually made users engage more.
Test sessions
Session #1 started December 8, 2020
Since this A/B test had three variants, we decided to split it into two sessions: once the results favored one variant over the others, we would drop the losing variant and see whether the results improved even further.
This first session favored the tabbed view with outstanding results. The full content variant had only a 0.57% conversion rate, with the 'model build' CTA being the most clicked, and showed a negative conversion rate lift of 87.83% at 100% confidence. We were very surprised by these results because the new HPE GreenLake pages used this style for content, so we decided to run more tests on that section of HPE.
The tabbed variant had a 7.48% conversion rate, with the second and third tabs being the most clicked, and showed a conversion rate lift of 62.15% at 96.77% confidence. These results were amazing to see given that the test had only been running for one month. We were very excited to see what was to come.
Session #2 started on January 8, 2021
This second session started on January 8, 2021 and is still running at 50%. After moving to the second test session and removing the third variant, the results got even better!
As of February 8, 2021, the tabbed variant had a 3.45% conversion rate, with the 'training' CTA receiving the most clicks and the 'deployment & monitoring' tab following closely. It also showed a conversion rate lift of 99.79% at 99.92% confidence.
This was definitive proof for us that users preferred the tabbed view over any other that we offer.
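The conversion rates, lifts, and confidence levels quoted above come from the testing platform. As an illustrative sketch (not the platform's exact method), comparable figures can be approximated with a two-proportion z-test:

```python
from math import sqrt, erf

def conversion_rate(conversions: int, visitors: int) -> float:
    """Fraction of visitors who converted."""
    return conversions / visitors

def lift(rate_variant: float, rate_control: float) -> float:
    """Relative lift of the variant over the control, as a fraction (0.6 = 60%)."""
    return (rate_variant - rate_control) / rate_control

def confidence(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """One-sided confidence that variant B outperforms variant A,
    using a pooled two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal CDF via the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))
```

For example, a control converting at 5% and a variant at 8% over 1,000 visitors each gives a 60% lift at better than 99% confidence under this model; real platforms may use different corrections, so the sample sizes here are assumed values.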
Impacts
Internal feedback
With results as strong as these, this was definitely a test we brought to the UX team and data team to discuss next steps. Everyone on the call believed the test results were amazing, and they wanted to start rolling out this tabbed view to newly designed pages.
For this test in particular, the UX designer who manages this test page decided to keep the test rolling as the results were getting users to engage more. He is looking to redesign the page and wants to keep this tabbed version in it.
These results were shared with our team managers, David and Matt, as well as the VP of the digital team, Gabie Boko. They all responded with excitement and were amazed by the results!
Conclusion
We reached a consensus that tabbed content views were well-received by users and should be consistently implemented on pages that contain substantial content and information. To further validate these findings and eliminate any potential ambiguity that could have influenced the results, we opted to isolate the tabbed component on a single page for testing.
For this purpose, we selected another high-traffic page abundant in content and restructured it to adopt the tabbed view. This additional test will serve to confirm if the positive outcomes observed in the initial test remain consistent, thereby solidifying our initial findings.