Been experimenting with connecting prompt-based workflows directly into Figma.
I call it vibe designing: turning a text prompt into a real, native Figma layout.
Works inside Figma. No exports. No switching tools.
Built a simple working prototype by vibe coding with ChatGPT and Claude, AI tools stitched together into a messy but functional setup. Not perfect. A lot still to fix.
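For anyone curious how the plumbing might look: a minimal sketch of the step where the model's raw reply is cleaned up and validated before being mirrored into native nodes via Figma's Plugin API (figma.createFrame, figma.createText). The schema and function names here are hypothetical illustrations, not the actual prototype's code.

```typescript
// Hypothetical shape of the layout spec the LLM is asked to return.
// The real prototype's schema may differ; this is an illustrative guess.
type LayoutNode = {
  type: "frame" | "text";
  name: string;
  text?: string;
  children?: LayoutNode[];
};

// Parse and sanity-check the model's raw reply. LLMs sometimes wrap JSON
// in markdown fences or return malformed output, so validate defensively
// before touching the Figma document.
function parseLayoutSpec(raw: string): LayoutNode {
  const cleaned = raw
    .replace(/^```(json)?/m, "") // strip an opening markdown fence, if any
    .replace(/```$/m, "")        // strip a closing fence, if any
    .trim();
  const spec = JSON.parse(cleaned) as LayoutNode;
  if (spec.type !== "frame" && spec.type !== "text") {
    throw new Error(`Unknown node type: ${spec.type}`);
  }
  return spec;
}

// Inside the plugin's main thread, the validated spec would then be walked
// and turned into native layers, roughly:
//   const frame = figma.createFrame(); frame.name = spec.name; ...
```

The point of the intermediate JSON step is that the plugin, not the model, owns node creation, so a bad model reply fails loudly instead of corrupting the canvas.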
Next experiments I’m exploring:
- Design styles
- Design tokens + systems
- Screenshot → design
- Design variations
- Responsive layouts
- Cleanup & UX audit
Curious what other directions I should explore.
Would something like this speed up your workflow? Or change how you start a design?