Welcome to my AI Blog!

Here, I will be writing reflections for my AI 285 class at Penn State. Follow along with my AI journey this semester!

Week 1

My First Blog Post!

Hi, I’m Delaney! I’m a senior at Penn State majoring in Human-Centered Design and Development, and I’m excited to be diving deeper into the world of artificial intelligence.

The rapid growth of AI is both fascinating and transformative. I believe it’s more important than ever to not only understand how AI works but also learn how to use it responsibly and effectively. As someone passionate about design and technology, I see AI as a tool that can enhance creativity, problem-solving, and user experience when applied thoughtfully.

Through this class, I’m hoping to strengthen my AI literacy and gain meaningful insights into its emerging role in our daily lives and future careers. To support my learning, I’ll also be experimenting with ChatGPT—using it to help refine my writing, structure my thoughts, and communicate my ideas clearly.

I chose to display this content on my personal website as a way to showcase my learning journey in AI.

I’m looking forward to tracking my growth, sharing my reflections, and building a stronger understanding of how AI can shape the way we learn, work, and design for the future.

Week 2

New Tools and “Vibe Coding”

Key Learning Moments

This week, I learned about different AI tools and the concept of “vibe coding.” Some of the tools we explored included:

  • Lovable – an AI-based website builder

  • Chatfuel – a no-code chatbot builder

  • Zapier – a workflow automation platform

Personal Connections

I think it’s great that there are easy ways to build websites and increase productivity. These platforms make it possible to create without needing to write code, which is especially helpful for smaller projects or early-stage prototypes. In fact, this website itself is built on a no-code platform.

That said, I don’t believe no-code should ever fully replace traditional coding. AI isn’t human—it can miss things, overlook context, and produce buggy results if the prompts aren’t crystal clear. Even then, there’s no guarantee the output will be perfect. These tools are incredibly useful, but I think they work best when paired with some tech knowledge and basic coding skills. That way, you can review and refine what the AI produces and use it as a true copilot.

This idea of using AI as a copilot really resonates with me. I enjoy exploring tools that boost productivity and creativity, but I also value having control and understanding over what’s happening behind the scenes. That balance between automation and human oversight is key to building things that actually work.

Challenges and Growth

One challenge I faced this week was pushing myself to try new tools and break out of my AI comfort zone. It’s easy to stick with what you know, but there’s so much out there to explore. I had to remind myself that experimenting—even if it feels awkward at first—is part of the learning process.

AI Tools Documentation

For this specific post, I used Microsoft Copilot to help with wording and structure. It didn’t write the reflection for me, but it helped me organize my thoughts and polish the language so everything flowed better.

I’m looking forward to what next week has in store. There’s still so much to learn, and I’m excited to keep exploring new tools and ideas.

Week 3

AI: A Game-Changer for Early-Stage Ideation

This week, I experimented with using AI as a brainstorming partner for startup ideation, and the results were eye-opening. Here's how I structured my AI-assisted ideation process and what I learned along the way.

My AI Brainstorming Process

I started by clearly stating my startup proposition to the AI. From there, I asked it to help me create detailed personas of potential stakeholders—thinking through who might be interested in, affected by, or involved with my startup idea.

Next came the most interesting part: I conducted mock interviews with the AI, having it roleplay as each of these stakeholder personas. This allowed me to explore different perspectives and uncover potential pain points I hadn't initially considered.

After these "interviews," I asked the AI to summarize the potential user needs that emerged from our conversations. This synthesis was incredibly valuable—it helped me see patterns and priorities across different stakeholder groups.

Using those identified needs, I was able to ideate specific use cases for my startup concept. The AI then helped me think through potential constraints and challenges I might face, giving me a more realistic view of what I'd be getting into.
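The staged flow above (proposition, then personas, then mock interviews, then needs synthesis, then use cases) can be sketched as a simple prompt pipeline, where each step's prompt builds on the previous step's output. This is a hypothetical illustration, not the exact prompts I used; the `ask` function is a stand-in for whatever chat interface you have.

```python
# A sketch of the staged ideation flow: each step's prompt builds on the
# last step's output. `ask` is a placeholder for a real AI chat call.
def ask(prompt: str) -> str:
    # Stand-in: a real implementation would call an AI chat API here.
    return f"[AI response to: {prompt[:40]}...]"

def ideation_pipeline(proposition: str, n_personas: int = 3) -> dict:
    results = {"proposition": proposition}
    # Step 1: generate stakeholder personas from the startup proposition.
    personas = ask(f"Given this startup idea: {proposition}. "
                   f"Create {n_personas} detailed stakeholder personas.")
    results["personas"] = personas
    # Step 2: mock interviews, with the AI roleplaying each persona.
    interviews = ask(f"Roleplay each persona below in a mock user interview "
                     f"about the idea. Personas: {personas}")
    results["interviews"] = interviews
    # Step 3: synthesize the user needs that surfaced in the interviews.
    needs = ask(f"Summarize the user needs that emerged from these "
                f"interviews: {interviews}")
    results["needs"] = needs
    # Step 4: turn needs into concrete use cases and constraints.
    results["use_cases"] = ask(f"Given these needs: {needs}, propose concrete "
                               f"use cases and likely constraints.")
    return results

out = ideation_pipeline("An AI study-planner for college students")
print(list(out.keys()))
```

Keeping each stage's output and feeding it into the next prompt is what preserves context across the conversation, whether you run this through an API or by hand in a chat window.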

What Worked Well

The AI excelled at several key areas:

  • Rapid persona generation: Creating detailed, realistic stakeholder profiles in minutes rather than hours

  • Perspective-taking: Roleplaying different viewpoints helped me see blind spots in my thinking

  • Data synthesis: Summarizing insights and presenting them in clear, actionable formats

  • Systematic thinking: Keeping me organized and ensuring I covered all angles

Important Limitations to Remember

While this process was extremely helpful, I want to be clear: AI doesn't replace the importance of connecting with real users and conducting actual customer interviews. Real human insights, market validation, and genuine customer input remain irreplaceable.

What AI provides is an excellent starting point for ideation—a way to rapidly explore possibilities and organize your thinking before you go out into the world to validate your assumptions with real people.

The Power of Human + AI Collaboration

When AI is combined with human experience and intuition, it becomes incredibly insightful. The AI helped me think more systematically and consider angles I might have missed, while my human judgment guided the process and interpreted the results.

For this specific brainstorming session, I used Claude AI, and I'm impressed with its ability to maintain context across our multi-step conversation and provide nuanced responses during the stakeholder roleplay.

What's Next

I'm experimenting with different AI tools each week to see which ones work best for different aspects of the startup process. This comparative approach is helping me understand the unique strengths of each platform.

I'm excited to see what I learn next week and discover new ways to improve my startup ideation process with AI assistance. The key is finding the right balance between AI-powered efficiency and human-centered validation.

Week 4

Testing AI Development Tools

Key Learning Moments

This week, I explored two AI-powered platforms—Lovable and Cursor—that support application and web development through no-code or low-code approaches. What stood out most to me was how quickly AI can generate functioning prototypes from a single prompt. It felt like skipping several steps in the traditional development process, while still being able to refine the outcome if needed.

Testing both platforms helped me understand the range of what AI can offer. Lovable emphasized simplicity and speed, while Cursor blended AI assistance with direct coding. This contrast showed me that choosing the right tool depends heavily on the user’s technical background and creative vision.

Personal Connections

As someone with some technical knowledge, I enjoyed experimenting with these platforms because they simplified development while still allowing me to dig into the code when I wanted to. I like being able to visualize applications in my head and consider what experiences would be most beneficial for the user. Having that foundation made it easier to guide the AI tools and get closer to the results I imagined.

At the same time, I realized how challenging this process might be for someone without a technical background. Knowing how to prompt effectively, or how to make small adjustments in the code, often made the difference between a clunky prototype and one that actually felt usable. This insight connects directly to my long-term goal of working in UX/UI design, where understanding both user needs and technical feasibility is essential.

Challenges and Growth

The biggest challenge I faced was with Lovable. While it generated results quickly, the usability wasn’t aligned with my vision, and I ran into limitations with the free version after only a few prompts. This was frustrating because it stalled my progress and limited experimentation.

However, I grew through this challenge by shifting to Cursor. With Cursor, I learned to prompt more effectively based on the mistakes I had made in Lovable, and the results were much stronger. I appreciated the hybrid model that let me lean on AI for efficiency while still applying my technical knowledge to refine the prototype. This adaptability felt like an important growth moment—learning not just to use the tools, but to pivot when one didn’t meet my needs.

AI Tool Documentation

This week, I used three AI tools:

  • Lovable: to quickly generate prototypes, though I found its usability limited.

  • Cursor: to combine AI-driven coding with manual edits, which produced my best prototype of the week.

  • ChatGPT: to help polish my reflection, organize my writing, and format it into a clear blog post.

Using these tools together enhanced my workflow. They allowed me to prototype more efficiently, reflect more thoughtfully, and ultimately create a more polished product.

Week 5

Empathy-Driven Innovation

Key Learning Moments

This week, I took a meaningful step back from testing AI tools and turned my attention toward the people who will ultimately use the solution I’m building. Rather than experimenting with new platforms, I prioritized understanding human needs. I identified my target audience, crafted thoughtful interview questions, and conducted user interviews to explore how my AI solution might impact their lives. These conversations revealed insights that no dataset could provide, reinforcing the importance of empathy in AI design.

Personal Connections

As someone who enjoys agile development, this approach felt natural. I thrive on iteration and feedback, and user interviews are a key part of that process. It was fascinating to see how an AI solution—often viewed as technical—can be shaped by real human needs and emotions. This experience aligned closely with my interests in building tools that are not only functional but also meaningful to the people who use them.

Challenges and Growth

Interviewing users came with its own set of challenges. Some responses were unexpected or unclear, and a few contradicted my assumptions. Instead of seeing these moments as setbacks, I used them as learning opportunities. I refined my questions, adjusted my approach, and stayed open to unexpected insights. It was a reminder that growth often comes from navigating uncertainty and that the best solutions emerge from listening, not assuming.

AI Tool Documentation

To help me reflect on this experience and organize my thoughts, I used Microsoft Copilot. Copilot supported me in structuring this blog post, identifying key themes, and turning scattered ideas into a clear and cohesive narrative. It wasn’t just a writing tool—it was a thinking partner that helped me articulate the emotional and intellectual journey I went through this week.

This experience reminded me that AI isn’t just about technology—it’s about people. By grounding my process in empathy and agile thinking, I’m not just building a smarter solution—I’m building one that matters.

Week 6

AI-Powered Prototyping and Storyboarding

This week opened my eyes to a whole new dimension of design workflow—using AI for storyboarding and prototyping. As someone who practically lives in Figma, I was both curious and skeptical about how AI could fit into my established process.

Key Learning Moments

The most significant revelation this week was discovering Figma's AI features, which I had never explored before. Within just a few iterations, the AI generated a highly interactive and impressively solid prototype. What struck me most was the quality of the output—it wasn't just static screens, but a functional, clickable prototype that demonstrated real user flows.

Another eye-opening moment came when I realized I could seamlessly copy pages from the AI-generated prototype directly into my design space for refinement. This bridge between AI generation and manual design work felt like the perfect hybrid approach. The AI wasn't replacing my design skills; it was amplifying them.

I also experimented with Canva AI for storyboarding. While it took some iteration to get close to my vision, the process revealed how AI tools interpret creative direction differently. Watching the AI work through various interpretations taught me to be more specific with my prompts and expectations.

Personal Connections

As a Figma devotee, I've built my entire workflow around efficiency and precision. The idea of AI generating designs initially felt like it might bypass the thoughtful process I value. However, this week shifted my perspective entirely.

I now see AI as an incredible tool for rapid prototyping—perfect for generating initial page ideas and exploring multiple directions quickly. The real magic happens when you combine AI's speed with your own UI/UX knowledge and design taste. It's not about the AI creating perfect interfaces (it definitely had its flaws—missing elements, repetitive buttons), but about accelerating the ideation phase so I can spend more time refining and perfecting.

This approach aligns perfectly with my goals as a designer: work smarter, iterate faster, and focus my energy on the nuanced decisions that truly elevate a design. The AI handles the heavy lifting of layout and structure, while I bring the strategic thinking and aesthetic refinement.

Challenges and Growth

The main challenge I encountered was managing expectations. The AI-generated prototypes weren't perfect—there were definitely gaps, redundancies, and elements that didn't quite match my vision. Initially, this felt frustrating. I found myself thinking, "I could have done this better manually."

But working through this frustration led to an important realization: perfection wasn't the point. The AI gave me a strong foundation in minutes, something that would have taken me significantly longer to build from scratch. The repetitive buttons and missing elements weren't failures—they were opportunities for me to apply my expertise and make the design truly mine.

With Canva AI, the challenge was even more pronounced. It took several attempts to get something close to what I wanted for my storyboard. I had to learn to iterate, adjust my prompts, and be patient with the process. This taught me that working with AI is a skill in itself—it requires clear communication, flexibility, and a willingness to guide the tool toward your vision.

The growth came from embracing AI as a collaborative partner rather than a replacement or a magic solution. These tools don't eliminate the need for design thinking; they enhance it by removing tedious groundwork and letting me focus on what I do best.

AI Tool Documentation

This week, I utilized two primary AI tools:

Figma AI (AI-Powered Prototyping)
I used Figma's built-in AI features to generate interactive prototypes. The tool allowed me to quickly create multiple iterations and export pages directly into my design workspace for further refinement. This significantly accelerated my prototyping process and gave me a solid structural foundation to build upon. The AI's ability to create interactive, clickable prototypes—not just static mockups—was particularly valuable for demonstrating user flows and functionality.

Canva AI (Storyboarding)
I leveraged Canva's AI capabilities for storyboarding work. While it required more iteration and prompt refinement than Figma AI, it ultimately saved considerable time in my creative process. The tool helped me visualize narrative flows and generate visual concepts that I could then refine to match my specific needs.

Claude (Reflection and Writing)
To articulate my thoughts and experiences for this blog post, I used Claude as a writing partner. I provided my raw thoughts and observations about the week's activities, and Claude helped me structure them into a cohesive narrative with clear sections. This tool was particularly valuable for transforming scattered observations into organized insights and finding the right language to express what I was thinking but struggling to articulate. Claude helped me see connections between my experiences and frame them in a way that highlights both the practical applications and the deeper learning moments.

These tools collectively enhanced my reflection process by giving me hands-on experience with AI-assisted design workflows and helping me articulate those experiences clearly. Rather than just theorizing about AI's role in design, I was able to test its practical applications, understand its limitations, and discover how to integrate it meaningfully into my existing process. Using Claude to reflect on using other AI tools added another meta-layer to this learning—experiencing firsthand how AI can support not just the design process, but also the critical thinking and communication that surrounds it. This experiential learning was far more valuable than any abstract discussion could have been.

Week 7

Peer Review and Prototype Validation

This week, we reviewed our peers' prototypes and learned firsthand why human feedback is essential to the design process. While AI tools can generate impressive interfaces quickly, this week reinforced that critical human judgment is irreplaceable when it comes to evaluating whether a design actually solves real problems.

Key Learning Moments

The most important realization this week was understanding that AI-generated prototypes can look polished without actually accomplishing the main user goals. When reviewing our peers' work, we had to ask the fundamental question: does this design actually solve the problem it's supposed to solve? We identified gaps that AI couldn't have caught—missing user flows, unclear interactions, and features that didn't align with the core use cases.

This experience highlighted something crucial: human reviewers bring contextual understanding and real-world insight that algorithms simply can't replicate. We were able to recognize whether designs addressed actual user needs because we could think through the problem from a user's perspective. AI can generate beautiful interfaces, but it can't inherently understand whether those interfaces work for the people using them.

Another key insight was realizing how iteration and feedback are fundamental to prototyping. A prototype isn't a finished product—it's a starting point for dialogue and refinement. The feedback loop between peer review and revision is where real design work happens. Without this iterative cycle, we'd end up with polished designs that don't actually solve problems.

Personal Connections

Reviewing my peers' prototypes helped me understand my own design choices better. Explaining why certain design decisions work (or don't work) in someone else's prototype deepened my understanding of design principles. Receiving feedback on my own work was equally valuable—hearing where users might struggle or where my assumptions diverged from reality was humbling and motivating.

I realized that as designers, our job isn't just to create beautiful interfaces; it's to solve problems. Peer review ensures we're on the right track before investing more time into refinement. The insights from colleagues who approach problems differently pushed me to reconsider assumptions I didn't even know I was making.

Challenges and Growth

The challenge this week was learning to give and receive feedback constructively. It's easy to defend your design choices, but harder to step back and genuinely evaluate whether they serve the user. Learning to separate ego from design work was important—feedback isn't criticism; it's information that helps you make better decisions.

Another challenge was recognizing that not every piece of feedback requires a change. Part of maturity as a designer is evaluating which insights are most valuable and which don't align with your user needs or vision. This discernment only comes from thoughtful consideration of the feedback in context.

The growth came from embracing feedback as a collaborative tool rather than a judgment. I learned that iteration isn't weakness—it's the backbone of good design. The prototypes that will ultimately be strongest are the ones that have been tested, critiqued, and refined through multiple rounds of human feedback. This week made it clear that design is a conversation, not a monologue.

AI Tool Documentation

This week, I used Claude AI as a writing partner to articulate my thoughts and experiences for this blog post. I provided my raw thoughts and observations about the week's activities, and Claude helped me structure them into a cohesive narrative with clear sections. This tool was particularly valuable for transforming scattered observations into organized insights and finding the right language to express what I was thinking but struggling to articulate. Claude helped me see connections between my experiences and frame them in a way that highlights both the practical applications and the deeper learning moments.

Week 8

Reflection, Iteration, and Process Analysis

This week, we reflected on our prototypes, applied peer feedback to improve them, and analyzed our design processes. It was all about refinement and learning from each other.

Key Learning Moments

The biggest insight was seeing how peer feedback transformed my prototype. After receiving critiques, I could pinpoint exactly where user goals weren't being met or where flows felt confusing. Fixing these issues helped me understand why they existed in the first place.

Reflecting on my process also revealed patterns in how I work. I noticed where I get stuck, where I move quickly, and which parts energize or drain me. This awareness helps me work smarter going forward.

I learned that iteration needs to be intentional. It's not about randomly tweaking things—it's about making informed changes based on specific feedback to solve actual problems.

Personal Connections

I used to see iteration as a sign I didn't get it right the first time. This week completely changed that mindset. Iteration isn't failure—it's how designs get better. Each round of feedback brings the design closer to actually solving the user's problem.

Reflecting on my process helped me see my strengths and weaknesses. I'm quick at generating concepts (especially with AI), but I need to slow down during validation to make sure those concepts truly work for users.

Peer feedback reminded me how valuable different perspectives are. What's obvious to me might confuse someone else. Collaboration makes the final design stronger.

Challenges and Growth

The hardest part was managing the emotional side of iteration. Receiving feedback meant accepting my prototype wasn't perfect, and making changes meant letting go of decisions I liked. It's humbling when something that made sense to me doesn't work for users.

Another challenge was knowing when to stop iterating. There's always something that could be tweaked. Learning when a prototype is "good enough"—when it solves the core problems—is tricky.

Reflecting on my process was harder than expected. It required honest self-awareness about what's working and what isn't. That discomfort is necessary for growth.

AI Tool Documentation

This week, I used Claude AI as a writing partner to articulate my thoughts and experiences for this blog post. I provided my raw thoughts and observations about the week's activities, and Claude helped me structure them into a cohesive narrative with clear sections. This tool was particularly valuable for transforming scattered observations into organized insights and finding the right language to express what I was thinking but struggling to articulate. This tool has become my favorite for writing, and I intend to use it each week from now on.

Week 9

Reflection, Documentation, and the Role of AI

This week was focused on reflecting on my design process and documenting how I approached my project. It was about stepping back and analyzing my own workflow, decision-making, and the role AI plays throughout.

Key Learning Moments

The biggest insight came from documenting my design process in detail. By keeping track of each step—what I tried, what worked, what didn't—I could see patterns in how I approach problems, as well as the importance of the role AI plays throughout this process. This documentation became a valuable reference for understanding my own thinking.

Documenting how AI integrated into my process was particularly revealing. I realized that AI wasn't just a feature of my final product—it was woven throughout my entire design workflow. From generating initial concepts to refining prototypes, AI accelerated my process and opened up possibilities I wouldn't have considered on my own.

Reflecting on my process helped me identify where I naturally excel and where I need to be more deliberate. I noticed that I move quickly through certain phases, especially when leveraging AI tools, but need to slow down in others to ensure the human-centered aspects are strong.

I learned that documentation serves multiple purposes. It's not just about recording what happened—it's about creating a foundation for future improvements and learning from both successes and mistakes.

Personal Connections

I used to view reflection as something you do after a project is finished. This week showed me that ongoing reflection and documentation during the process is far more valuable. It helps you course-correct in real-time rather than discovering issues too late.

Taking time to document my process felt tedious at first, but I came to appreciate how it clarified my thinking. Writing things down forced me to be more intentional about my choices, especially regarding when and how to use AI effectively.

Reflecting on AI's role in my process made me aware of how it shapes my design thinking. AI helps me iterate faster and explore more variations, but I need to balance that speed with thoughtful validation. The AI can generate options quickly, but I'm responsible for ensuring those options actually serve user needs.

Challenges and Growth

The hardest part was being disciplined about documentation. It's easy to get caught up in the work and forget to record the reasoning behind decisions. Developing a consistent habit of reflection and documentation took effort.

Another challenge was being honest in my reflections. It's tempting to gloss over mistakes or poor decisions, but real growth comes from acknowledging what didn't work and understanding why.

Understanding where AI fits in my process versus where human judgment is critical was an ongoing challenge. I had to learn when to let AI accelerate my work and when to slow down and apply my own critical thinking. Finding that balance improved both my process and my prototype.

AI Tool Documentation

As in previous weeks, I used Claude AI as a writing partner for this blog post. I shared my raw thoughts and observations about the week's activities, and Claude helped me structure them into a cohesive narrative with clear sections, transforming scattered observations into organized insights.

Week 10

Building a Development Plan

This week, I worked on creating a comprehensive development plan for my Nittany AI Challenge project. Using AI assistance, I prioritized features, made technical decisions, and built a realistic roadmap for the next five weeks.

Key Learning Moments

The biggest insight came from applying the MoSCoW prioritization method with AI guidance. Before this exercise, I had a long list of features I wanted to build, but I hadn't really thought critically about what was essential versus what was just nice to have. Working through this process forced me to be honest about what's realistic for my timeline.

Using AI to evaluate feature complexity was eye-opening. The AI helped me see which features would take significantly more time than I initially estimated. Some things I thought were simple turned out to have hidden complexity, while other features that had intimidated me were actually quite manageable.

I learned that a good development plan isn't about fitting everything in—it's about identifying the minimum viable product that actually solves the core problem. Everything else can wait or be cut entirely.

Documenting my development roadmap created clarity. Instead of having vague ideas about "building my project," I now have specific tasks with time estimates. This transforms an overwhelming project into manageable steps.
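A MoSCoW breakdown like the one described above can be kept as lightweight structured data, which makes it easy to total the time estimates per bucket and sanity-check the MVP against the hours actually available. The feature names and estimates below are illustrative placeholders, not my actual roadmap.

```python
# Hypothetical MoSCoW roadmap: each feature gets a priority bucket
# (must / should / could / wont) and a rough time estimate in hours.
features = [
    ("User login",         "must",    6),
    ("Core AI query flow", "must",   15),
    ("Results dashboard",  "should",  8),
    ("Export to PDF",      "could",   4),
    ("Mobile app",         "wont",   40),
]

def hours_by_bucket(feats):
    # Sum estimated hours for each priority bucket.
    totals = {}
    for name, bucket, hours in feats:
        totals[bucket] = totals.get(bucket, 0) + hours
    return totals

totals = hours_by_bucket(features)
mvp_hours = totals["must"]
print(totals)  # {'must': 21, 'should': 8, 'could': 4, 'wont': 40}
print(f"MVP (Must Have) estimate: {mvp_hours} hours")
```

Seeing that the "Won't Have" bucket alone can dwarf the MVP estimate is exactly the kind of reality check that makes cutting features feel justified rather than painful.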

Personal Connections

I used to think planning was boring and got in the way of actually building things. This week completely changed that perspective. Having a clear plan actually makes me more excited to start building because I know exactly what I'm doing and why.

Breaking down my project with AI assistance helped me see that my idea is actually achievable. When it was just one big concept, it felt overwhelming. Now that it's broken into tasks, I can see a path forward.

The prioritization process was humbling. I had to accept that some features I was excited about need to go in the "Won't Have" category. But that's okay—it means my MVP will be focused and strong rather than bloated and unfinished.

Reflecting on technical decisions with AI helped me think through trade-offs I hadn't considered. The AI asked questions that pushed me to justify my choices, which strengthened my understanding of why I'm building things a certain way.

Challenges and Growth

The hardest part was being realistic about scope. I wanted to include every feature I'd brainstormed, but the AI helped me see that wasn't feasible. Accepting those limitations required letting go of attachment to certain ideas.

Another challenge was estimating time accurately. I tend to be overly optimistic about how quickly I can build things. Having AI provide reality checks on complexity helped, but I still had to be honest with myself about my actual available time and skill level.

Making technical decisions was harder than expected. There are so many ways to build an AI application, and choosing between different approaches required thinking through implications I hadn't fully considered. The AI helped surface questions I needed to answer before committing to specific technologies.

Understanding which risks to plan for required careful thought. Not every potential problem is worth worrying about right now, but some risks could derail the entire project if I don't address them early.

AI Tool Documentation

As in previous weeks, I used Claude AI as a writing partner for this blog post. I shared my raw thoughts and observations about the week's activities, and Claude helped me structure them into a cohesive narrative with clear sections, transforming scattered observations into organized insights.