Bringing Clarity to an AI Landscaping Tool

Helping users understand what the tool offers and how to customize it

Overview
Neighborbrite is a startup offering an AI-powered landscaping tool that helps homeowners generate design inspiration for their yards. The company aimed to identify where users were getting stuck, specifically to reduce login drop-off and increase engagement with the paid customization feature.

In moderated user testing, many participants found the tool’s capabilities unclear during sign-up, and the customization input unintuitive and below expectations. We proposed designs to improve the clarity and usability of the AI features.
Design for AI
User Testing
UX Design
Project Type
UX consulting for Neighborbrite (In-class project for external client)
Timeline
Nov 2023 - Dec 2023
Team
4 graduate students in UX design, including myself
My Role
During the research phase, we contributed equally in a collaborative classroom setting. In the design phase, we divided tasks, and I focused on the landing page and customization feature, both of which are highlighted in this case study.
Impact
We delivered design recommendations with mockups and a comprehensive report. The client responded positively to our work, and we contributed to their success in reaching 100,000 users within a year.
Context

We worked with a startup offering an AI-powered landscaping tool

Our client, Neighborbrite, is a startup offering an AI-powered landscaping tool that lets users upload photos of their yard and receive design inspirations. When the project began, the app was only 4 months old, and the team needed help understanding users and improving the UX.

Problems
Login drop-off at the start of the flow
The CEO expressed a desire to reduce the drop-off rate, which was around 30% at the time.
Getting pro plan customers
The CEO also mentioned wanting to encourage usage of the customization feature, part of the paid Pro plan, and was curious how users perceived it.
Goal

Identify and address usability issues that cause user drop-off or confusion during AI landscape design generation

Research

Tested with homeowners interested in landscaping

First, we conducted user testing to see how free and pro (paid) users interact with the tool.

Methodology
Moderated remote user testing
Participants
8 homeowners with interest in landscaping
Scope
AI landscape image generation &
Customize feature (Pro plan)
Device
Mobile, which is a major device for users
Insights
Sign-up hesitation due to unclear tool purpose
Requiring sign-up upfront caused hesitation, as users didn’t fully understand how the feature worked. The demo video on the landing page was often overlooked, as it was placed far down the page.
75% hesitated to sign up due to limited understanding of the tool.
Opportunity 1: Clarify what the product offers
Unclear, unintuitive AI customization with unmet expectations
Users were confused about how the AI customization worked and wanted a more intuitive interface. The app often returned odd or unrealistic designs, leading to reduced trust in the AI feature.
62.5% wanted intuitive customization controls that communicate what's possible.
87.5% expressed that the AI customization functionality and its results did not meet their expectations.
Opportunity 2: Clarify customization input options and improve input usability
Solution - Opportunity 1

Highlight the demo video and delay login to improve clarity and trust for tool

Users tend to tap the “Try now” button shortly after arriving on the landing page, but many hesitate on the next screen, the login page, as they don’t fully understand what the tool does. To address this, I recommended making the demo video more noticeable to help users understand the product.

Issue
The demo video was placed below the fold, and many users missed it.
Solution
Providing a "Watch Demo" button that scrolls to the video on click ensures easy access and improves visibility.

I also recommended that, if feasible from an engineering or business perspective, the login or sign-up step be delayed until just before generating the design, or only when credentials are needed, so users can explore the tool first and feel more confident about signing up.

Solution - Opportunity 2

Use checkboxes or suggestions to guide AI customization inputs

Users were confused by the free-text input for adding or avoiding elements. Since these are simple entries, users didn’t expect a fully free-text format and instead looked for more efficient input. The limited examples in the input field didn’t clearly convey the range of acceptable inputs.

Issue
The simple input examples made users question the need for free-text input and left them unsure what to enter.

To address this, I aimed to clarify the input examples and improve input efficiency. I came up with two solutions:

Checkboxes - combine checkbox-based selection for common elements with a free-text input for everything else.
Suggestions - show examples and common elements as users type.

In the final presentation, I proposed the checkbox idea because it helps users more easily grasp the range of customizable elements, and 25% of participants explicitly expressed a need for checkboxes (filters).

* This is a slightly refined version of the original mockup.
Solution
Checkboxes Idea
Pro
Easy to grasp common element categories and examples, and easy to input.
Con
Checkboxes cover only common elements, not all possibilities, and may discourage creative text input.
* This is a slightly refined version of the original mockup.
Solution
Suggestions Idea
Pro
Integrates examples more naturally than checkboxes.
Con
Users who don’t yet know what they want to customize may struggle to grasp the range of common elements and categories.
🚨 If I were doing this today, I’d design a way to guide users in building prompts, so they can get the most out of the AI. I go into more detail later on this page.
Result

Client satisfaction and contribution to their success

At the end of the project, we presented our findings to the client and delivered a detailed report sharing user feedback from the testing sessions. The CEO expressed satisfaction with our work, and we contributed to the client’s success in reaching 100,000 users within a year.

"This is extremely useful. This is exactly what I was hoping to see!"
Luis, CEO of Neighborbrite
Next Steps

Measure conversion and customer satisfaction

As external UX consultants, we didn’t have the opportunity to measure the impact. If I were part of their team, I would use the following metrics to evaluate success.

Login conversion rate
Evaluate whether the update reduces login hesitation.
Time on task
Measure whether the update improves customization clarity and reduces confusion.
Customer satisfaction score
Measure satisfaction with customized images by embedding a feedback feature.

If I were to do it again, I’d explore this direction.👇

What was missing?

Address broader yard design needs beyond specifying elements by leveraging AI

We assumed that adding or avoiding specific elements was the primary need. However, the participants didn’t always have specific elements in mind; instead, they expressed more abstract or general needs, such as “low maintenance” or “suitable for dry climates.” Since the strength of AI lies in its flexibility, helping users write prompts based on vague needs might have been the better solution.

Lightning demo

How do other AI tools guide prompt input?

I looked into other AI tools to see how they assist with prompt writing and found a common pattern: showing example prompt sentences in easily accessible ways.

I researched Freepik, Canva, Claude, and ChatGPT. Their solutions range from random prompt generators (like a dice button) to simple lists of example prompts.
Design

Provide prompt example template

Inspired by the lightning demo, I designed a prompt-assist feature. It displays common requests that users typically have as starting points. Users can tap a button to add these sample prompts; if there’s already a prompt in the field, they can choose to either append or replace it.
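The append-or-replace behavior above can be sketched as a small function. This is a hypothetical illustration only; the function and type names (applySamplePrompt, InsertMode) are my own and not part of Neighborbrite’s actual code.

```typescript
// Hypothetical sketch of the prompt-assist "append or replace" behavior.
// All names here are illustrative, not from the real product.
type InsertMode = "append" | "replace";

function applySamplePrompt(current: string, sample: string, mode: InsertMode): string {
  // Replacing, or starting from an empty field, just takes the sample.
  if (mode === "replace" || current.trim() === "") {
    return sample;
  }
  // Appending joins with a separator so the combined prompt stays readable.
  return `${current.trim()}, ${sample}`;
}
```

For example, appending the sample "suitable for dry climates" to an existing prompt "low maintenance" would yield "low maintenance, suitable for dry climates", while choosing replace would discard the existing text.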

Solution
Pro
Easy to understand expected prompts and take advantage of AI capabilities.
Con
The examples are limited and don’t show specific elements.
How to iterate?

Need a more detailed feedback mechanism

During the project, the CEO expressed interest in understanding how users feel about the customization experience. However, there’s no built-in way to collect detailed feedback on each generated output or capture user reactions. User testing helps, but it isn’t scalable. To evaluate the AI customization UX, I would also recommend adding a feedback feature like the one below.

Issue
While there is a global feedback feature, it's difficult to gather detailed feedback in the context of a specific image and prompt.
Solution
Collecting detailed feedback for each AI-generated image, along with the prompt, user reaction, and comments, can provide valuable insights.
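One way such a per-image feedback record could be structured is sketched below. The field names (imageId, reaction, and so on) are assumptions for illustration, not the product’s actual schema.

```typescript
// Sketch of a per-image feedback record; all field names are assumed.
interface GenerationFeedback {
  imageId: string;          // which generated image this refers to
  prompt: string;           // the prompt that produced it
  reaction: "up" | "down";  // quick thumbs-style reaction
  comment?: string;         // optional free-text detail
  createdAt: string;        // ISO timestamp, useful for trend analysis
}

function buildFeedback(
  imageId: string,
  prompt: string,
  reaction: "up" | "down",
  comment?: string
): GenerationFeedback {
  return { imageId, prompt, reaction, comment, createdAt: new Date().toISOString() };
}
```

Tying each reaction and comment to the exact image and prompt that produced it is what makes the feedback analyzable at scale, unlike a single global feedback form.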