Solar Gardens: Reducing User Friction in a Renewable Energy Enrollment Experience
Consumers Energy 2025
Experience Design Internship
During my Experience Design internship at Consumers Energy in Summer 2025, I was tasked with refreshing the Solar Gardens webpage, which had not been updated in years.
Through research, I identified that the experience struggled to convert interested users because people could not clearly understand how the program worked or what they would receive in return.
I led the redesign to reduce confusion, clarify the value of the program, and make key information easier to find, helping users make more confident decisions about whether to enroll.

Research
Test Questions
I conducted unmoderated usability testing using UserTesting.com to better understand how users interacted with the page.
- 7 participants
- Midwest residents responsible for their utility bills
- Tested on the existing webpage
Focus Areas:
- Clarity and Communication of Information
- Visual Design and Usability
- Relevance and Personal Resonance
1. Spend 1-3 minutes reviewing this webpage. If you click on any elements, please return to the originally linked page. Think aloud as you review it.
2. What parts of the page drew your attention the most?
3. In your own words, describe how the Solar Gardens program works.
4. If you did not already, please watch the video.
5. How did you feel about the video?
6. How, if at all, did the information in the video change your perception of the program? [Verbal response]
7. Which section of the page gave you the clearest idea of how the program works? Why was that section helpful?
8. The content is easy to understand. Explain your answer. [5-point rating scale: Strongly disagree to Strongly agree]
9. At any point, did you lose interest or feel overwhelmed? If so, what specific sections or elements caused that feeling?
10. Did any part of the page confuse you? If so, what was unclear or difficult to understand?
11. At any point, did you encounter any difficulty navigating the page? If so, please explain where this occurred.
12. To what extent do you feel the program resonates with you, if at all? Why?
13. From viewing this page alone, I feel confident that I would join the described program. Explain your answer. [5-point rating scale: Strongly disagree to Strongly agree]
14. If you didn’t sign up today, what additional information or reassurance would you need to feel comfortable doing so?
15. How would you rate the layout of this page? Please explain your answer. [5-point rating scale: Poor to Excellent]
16. How would you rate the ease of finding information on this page? Please explain your answer. [5-point rating scale: Very difficult to Very easy]
17. What additional feedback, if any, do you have about what is being communicated on this webpage?
Synthesis
I synthesized feedback using an affinity diagram to identify patterns across user responses.
- Grouped raw quotes into themes
- Identified recurring confusion points
- Translated those into actionable insights (a small sketch of this grouping step appears below)
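The synthesis itself happened on an affinity board, not in code, but the grouping step can be illustrated with a minimal sketch. The quotes below are real participant quotes from this study; the theme labels, participant numbers, and all names (`TaggedQuote`, `byTheme`) are hypothetical, chosen only for illustration.

```typescript
// Hypothetical sketch of the quote-grouping step behind the affinity diagram.
// Theme labels and participant numbers are illustrative placeholders.

interface TaggedQuote {
  participant: number;
  quote: string;
  theme: string; // label assigned during affinity mapping
}

const quotes: TaggedQuote[] = [
  { participant: 1, quote: "I don't understand the subscription model", theme: "How it works" },
  { participant: 3, quote: "What am I getting in return?", theme: "Value exchange" },
  { participant: 4, quote: "Would it save you money in the long run?", theme: "Value exchange" },
  { participant: 6, quote: "Is this a donation?", theme: "How it works" },
];

// Group quotes by theme so recurring confusion points stand out.
const byTheme = new Map<string, string[]>();
for (const q of quotes) {
  const bucket = byTheme.get(q.theme) ?? [];
  bucket.push(q.quote);
  byTheme.set(q.theme, bucket);
}

// Themes with the most supporting quotes surface first.
const ranked = [...byTheme.entries()].sort((a, b) => b[1].length - a[1].length);
for (const [theme, qs] of ranked) {
  console.log(`${theme} (${qs.length} quotes):`, qs);
}
```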

What the research uncovered
Through usability testing, I found that the main issue was not just that the page felt outdated.
Users struggled to understand:
- How the program works
- What they would receive in return
- How to evaluate whether the program was worth it
What users were saying
- “Would it save you money in the long run?”
- “I don’t understand the subscription model”
- “Is this a donation?”
User motivations
Through usability testing, I noticed that users approached the program with different expectations, particularly around financial value and environmental impact.
I grouped participants into three categories based on their responses:
Cost Saver
Focused on financial return and wanted clear breakdowns of cost and credits
Eco-Conscious
Motivated by contributing to clean energy and reducing environmental impact
Ownership Advocate
Preferred owning solar panels and questioned the long-term value of a subscription model
While motivations differed, a consistent pattern emerged:
Users expected to understand what they would gain from the program, but struggled to interpret the value clearly.
- “What am I getting in return?”
- “Would it save you money?”
What this revealed
The program aligns most clearly with users who are motivated by environmental impact, but the experience does not make this distinction explicit.
At the same time, users who approached the program expecting financial benefit were left confused, because:
- They did not clearly understand the cost structure
- The value exchange was not immediately obvious
- Key financial details were difficult to find
Key Issues
Users lacked clarity on how the program works
Some users were unsure how the subscription model functioned or how credits were applied.
- “I don’t understand the subscription model”
- “What am I getting in return?”
Users wanted clearer information about cost and value
Users consistently questioned whether the program provided meaningful financial benefit.
- “Would it save you money in the long run?”
Users did not access critical information
Users did not engage with additional content that explained the program.
- 0% of users clicked “Learn More”
As a result, users missed key details about:
- How the program works step-by-step
- Cost and credit structure
- Commitment (12-month minimum) and flexibility
- What happens after enrollment
Helpful content was not surfaced effectively
A video explaining the program provided one of the clearest explanations, but users did not initially engage with it.
- Most users skipped the video during initial exploration
- When prompted, 85% found it helpful
User feedback:
- “The video explained more about how the program actually works”
- “It was the clearest part”
An infographic explaining key information was also unclear to many users.
Information overload made the page difficult to navigate
- “Too much information”
- “Hard to find what the program is about”
Users struggled to identify key details within dense content.

Key observation
Users are missing out on vital information that could be the determining factor for whether they enroll in Solar Gardens.
How that shaped the redesign
This shifted the project from a general page refresh into a clarity and information architecture problem.
Instead of only modernizing the page visually, I focused on:
- Surfacing key information earlier
- Clarifying cost and credit details
- Making the value of the program easier to understand
- Restructuring the page so users could understand the program before deciding whether to enroll
Design Decisions
Clarified the value of the program upfront
Users interpreted the program differently and were unsure what they would gain.
I introduced three clear benefits at the top of the page and made environmental impact more prominent.

Surfaced critical information
Important details were hidden behind “Learn More” content that users did not access.
I moved this information directly onto the main page so users could understand:
- How the program works
- Cost and credits
- Commitment details
Previously, users only saw this information after clicking “Join Now,” so they struggled to evaluate the financial aspects of the program. To address this, I:
- Surfaced pricing per solar block
- Clarified the differences between the two options
- Brought the cost calculator from the enrollment flow onto the main page (a sketch of the calculator logic appears below)
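Surfacing the calculator makes the value exchange concrete: users can compare what they pay against the bill credit they receive. Below is a minimal sketch of that kind of block-subscription math. The option names, prices, and credit rates are hypothetical placeholders, not Consumers Energy’s actual pricing, and `PlanOption`/`estimate` are invented names.

```typescript
// Hypothetical cost/credit estimator for a community solar subscription.
// All rates below are illustrative placeholders, not actual program pricing.

interface PlanOption {
  name: string;
  blockPriceUsd: number;     // monthly subscription cost per solar block (assumed)
  creditPerBlockUsd: number; // estimated monthly bill credit per block (assumed)
}

// Two illustrative options, mirroring the page's "two options" comparison.
const OPTIONS: PlanOption[] = [
  { name: "Option A", blockPriceUsd: 10.0, creditPerBlockUsd: 7.5 },
  { name: "Option B", blockPriceUsd: 8.0, creditPerBlockUsd: 6.0 },
];

interface Estimate {
  monthlyCostUsd: number;   // what the subscriber pays each month
  monthlyCreditUsd: number; // what appears as a credit on their bill
  netMonthlyUsd: number;    // cost minus credit (positive = net cost)
}

function estimate(option: PlanOption, blocks: number): Estimate {
  const monthlyCostUsd = option.blockPriceUsd * blocks;
  const monthlyCreditUsd = option.creditPerBlockUsd * blocks;
  return {
    monthlyCostUsd,
    monthlyCreditUsd,
    netMonthlyUsd: monthlyCostUsd - monthlyCreditUsd,
  };
}

// Example: compare both options for a 3-block subscription.
for (const option of OPTIONS) {
  const e = estimate(option, 3);
  console.log(
    `${option.name}: pay $${e.monthlyCostUsd.toFixed(2)}/mo, ` +
    `credit $${e.monthlyCreditUsd.toFixed(2)}/mo, ` +
    `net $${e.netMonthlyUsd.toFixed(2)}/mo`
  );
}
```

Showing this math directly on the main page speaks to the “What am I getting in return?” question that surfaced repeatedly in testing.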

Simplified dense content
I combined key information from the dense text and the infographic into a simple, bullet-based format that is easier to scan, so users can quickly understand the program without digging through long text or confusing visuals.

Adding CTAs
- The original page had a call to action only at the top.
- I added additional CTAs throughout the page so users could take action after understanding the program.

Key Takeaways
The redesigned experience:
- Made key information easier to find
- Improved clarity of how the program works
- Reduced effort required to interpret content
Follow-up validation
I conducted a small follow-up usability test to gather user sentiment on the redesign.
While not a direct comparison, feedback indicated that:
- Previous confusion points were reduced
- Users were better able to understand the program
- Information felt easier to navigate
What I learned
Updating an outdated experience isn’t just about improving visuals.
Research revealed deeper issues in clarity and information structure, and addressing those had a greater impact on usability than surface-level changes alone.
