SYS.NAME  : ZUI_OS v1.0.0
SYS.AUTH  : GUEST_ACCESS_GRANTED
SYS.NODE  : zui.ooo
UPTIME    : calculating...
TERMINAL  : TTY0
STATUS    : 200
[ZUI — ASCII banner]
$ whoami
Product designer who builds. BS Statistics (UMich), MS Data Vis (Parsons).
4+ years shipping consumer products end-to-end -- from Chrome extensions to AI agent UX.
Currently shipping Melo -- a vocabulary tool that lives inside Claude.
$ cat status.txt
LOCATION  CA -- NYC
FOCUS     Agentic UX -- Consumer AI -- Interface design
CONTACT   hi@zui.ooo
TIP: ↑↓ arrows to navigate -- type help for commands
$ ls -la /projects/
6 entries -- click filename to expand -- click [OPEN] to read case study
NAME            SIZE    MODIFIED  DESCRIPTION
MELO.connector  12.4kb  2025-03   Vocab tool living inside Claude via MCP
DESC    iOS app + MCP server syncing vocab from Claude conversations. Ask Claude about a word -- it saves with pronunciation, conjugation, gender.
ROLE    Solo -- design, Swift, MCP server
STATUS  LIVE -- App Store  ·  IN SUBMISSION -- Claude Connector Directory
$ cat MELO.connector [OPEN]
AI_AGENT_UX.case  38.1kb  2025-01  Trust design when the AI is the actor
DESC    iOS app for AI-managed investment positions + vault platform where agents autonomously allocate across markets.
ROLE    Product design lead -- Art director
STATUS  UNRELEASED -- case study on request
$ cat AI_AGENT_UX.case [OPEN]
CONSUMER_PRODUCTS.log  94.7kb  2021-2023  3 products -- build what I design -- 60k users
DESC     Identity extension -- social passport web app -- iOS + Android information browser. Each was a deliberate pivot.
ROLE     Founding designer & brand lead -- end-to-end product design -- front-end
METRICS  60k users (identity ext, ~5mo) -- 320k website visitors
$ cat CONSUMER_PRODUCTS.log [OPEN]
BRAND_MOTION.dir  6.5kb  2023-2024  Rebrand -- NFC tap experience -- 3 cities, 4k+ attendees
REBRAND  Full identity relaunch Feb 2024. Visual system + motion language + campaign.
EVENTS   Denver 2023 -- Singapore 2024 (Google APAC HQ) -- Bangkok 2024 -- 1k-2k+ attendees each.
STATUS   DELIVERED
$ cat BRAND_MOTION.dir [OPEN]
WINDSPELL.app  --  2025  City weather as a generative kalimba instrument
DESC   Type a city. Live wind data drives a rotating kalimba ring -- speed is tempo, direction is which note plays. 14 scales mapped to cultural regions by coordinates. React + Three.js + Web Audio API.
ROLE   Solo -- concept, design, build
STACK  React · Three.js · GSAP · Web Audio API · GLSL
$ cat WINDSPELL.app [OPEN]
PAGURO.wip  --  2025  Token cost companion creature -- early prototype
DESC    A terminal/widget app that converts LLM token consumption into an in-world currency. Feed and customize a small creature, buy it a new shell -- making abstract API costs tangible and a little playful.
ROLE    Solo -- concept, design, early prototyping
WHY     Tokens are invisible and abstract. Paguro asks: what if consumption had a face? It is also an excuse to think about non-extractive AI product design.
STATUS  EARLY WIP -- not yet public
$ cat PAGURO.wip [OPEN]
$ cat MELO.connector
01 / 04
MELO
A vocabulary tool that lives inside Claude
ROLE      Solo -- design + Swift + MCP
YEAR      2025
PLATFORM  iOS + Claude Connector
STATUS    LIVE -- App Store  ·  IN SUBMISSION -- Claude Connector Directory
// the problem

I use Claude a lot. When I am reading something in French -- an article, a recipe, a film -- I ask Claude about words I do not know. Claude explains it well. I think: I should write that down. I do not. By tomorrow it is gone.

The problem is not that I lack a vocabulary app. I have tried them. Duolingo turns it into a game. Anki gives you a graveyard of flashcards for words you never actually cared about. Neither touches the two things that matter most in French: conjugation and grammatical gender.

The deeper problem: none of these apps know which words I actually encountered. They give you a list someone else decided was important. But the words worth learning are the ones that show up in your own life -- in the film you watched, the conversation you had, the thing you looked up at midnight.

// the insight

The right place to capture a word is not a separate app you open. It is the moment you are already in -- the conversation with Claude. I was already asking Claude anyway. I just needed Claude to remember.

MCP made that possible. Connect Melo to Claude and it quietly saves every word you ask about -- the IPA pronunciation, an example sentence, the meaning, conjugation, grammatical gender -- automatically, in the background. No copy-pasting. No switching apps. The word shows up in Melo because you already asked about it. That is the whole interaction.
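As a sketch of what that background capture might persist -- field names and normalization are illustrative, not Melo's actual schema:

```typescript
// Hypothetical shape of a saved word card. The real Melo schema is not
// public; this only mirrors the fields named in the case study.
type WordCard = {
  word: string;
  language: "fr" | "es" | "it" | "ja";
  ipa: string;                           // IPA pronunciation
  meaning: string;
  example: string;                       // example sentence from the conversation
  gender?: "m" | "f";                    // grammatical gender, when the language has it
  conjugations?: Record<string, string>; // e.g. present-tense forms
  savedAt: string;                       // ISO timestamp
};

// An MCP tool handler would receive the word Claude was asked about and
// store a card; here we just normalize the payload into a card record.
function toWordCard(input: Omit<WordCard, "savedAt">, now: Date = new Date()): WordCard {
  return { ...input, word: input.word.trim().toLowerCase(), savedAt: now.toISOString() };
}
```

The point of the sketch: the user never fills a form -- the card is assembled entirely from what the conversation already contained.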

// who it is for

Built for people who watch French films, read Italian recipes, chat with an AI tutor at midnight.

Not for streaks. For the words that actually show up in your life.

Sign in with Apple, paste your MCP link into Claude's settings, and that is it.

// design decisions

What is on a word card: Pronunciation, example sentence, conjugation table, grammatical gender. These four and nothing more. English translation is deprioritized -- the goal is to stay in the target language as long as possible.

Gender as color: Masculine and feminine nouns are color-tagged consistently. Gender becomes a visual fact you absorb over time rather than memorize consciously.

Flashcard design: The card tests conjugation, not definition. Most apps ask "what does this mean?" -- Melo asks "how do you use it?" A deliberate departure from the Anki model.

No word lists: Melo does not give you a curated list to work through. It only shows you words you actually encountered. The words worth learning are the ones that already showed up in your life.

The sync UX: The hardest problem was making the Claude-to-Melo connection feel invisible rather than bolted on. The word should feel like it was always there.

// try it

Melo is live on the App Store and in submission to Claude's Connector Directory. Download on the App Store, connect to Claude, and ask about any word in French, Spanish, Italian, or Japanese.

Melo sync flow — Claude to MCP to iOS
$ cat AI_AGENT_UX.case
02 / 04
AI AGENT UX
Designing trust when the AI is the actor
ROLE      Product design lead -- Art director
YEAR      2025
PLATFORM  iOS -- Web
STATUS    UNRELEASED
// the design problem

Most product design assumes the user is the one taking action. The interface is a surface for human decisions. What happens when the AI is the actor?

At an AI startup in 2025, I worked on two products that put this question at the center.

// key design tensions

Transparency vs. noise: Showing every AI decision creates log fatigue. Hiding decisions creates distrust.

Control vs. automation: Every intervention UI you add undermines the value of the AI. The upfront configs -- the prompt, the constraints, the operation styles -- need to be expansive and deliberate. Once that step feels owned, the runtime feels earned rather than imposed.

Trust signals: Whether it is a smart contract or any other constraint -- the real safety harness is invisible because users cannot see code. The design challenge is making a technical guarantee feel human: legible limits, not reassuring copy.

// product 01 -- ai as financial companion

The concept: an AI that could safely manage investment positions on your behalf. Hard constraints enforced by smart contracts -- not just UI affordances -- meant the AI could not spend more than you had configured. A genuine architectural safety guarantee, not a design-layer promise.
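A design-layer sketch of that kind of hard constraint -- names and limits are invented here, standing in for what the real product enforced at the smart-contract level:

```typescript
// Illustrative spend guard. In the actual product the guarantee lived in a
// smart contract, not application code; this only shows the decision shape.
type SpendConfig = { maxPerTx: number; maxPerDay: number };

function canExecute(
  amount: number,
  spentToday: number,
  cfg: SpendConfig
): { allowed: boolean; reason?: string } {
  if (amount <= 0) return { allowed: false, reason: "non-positive amount" };
  if (amount > cfg.maxPerTx) return { allowed: false, reason: "exceeds per-tx limit" };
  if (spentToday + amount > cfg.maxPerDay) return { allowed: false, reason: "exceeds daily limit" };
  return { allowed: true };
}
```

The design job is then making these limits legible in the UI -- the constraint exists whether or not the user reads it.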

The interface looked like a messaging app. Deliberately. If you are going to have an AI managing your money, natural language is still the most flexible way to express what you want -- "take profit if we're up 20%" is clearer than any form UI. I used in-chat interactive widgets for quick actions that needed precision: confirm a transaction, check a live market chart, review a position. The chat handled intent; the widgets handled execution.

// design decision -- widgets + chat
Design decision: widgets embedded in chat UI
// interface concept

Recreated in lo-fi to illustrate the interaction model -- original product under NDA.

[mock -- AI · Financial companion · ● active]
AI:   Good morning. 2 market signals overnight. Want a briefing?
User: Send $18 to Charlie for dinner
AI:   Of course. Confirm the details:
      [Transaction widget -- Amount: $18.00 USDC · To: 0x3f4a...c18e · Fee: ~$0.12 · Confirm / Cancel]
User: Buy $500 ETH if it drops 5%
AI:   Done — limit order set. I'll notify you when it triggers.
Natural language + inline transaction widget
[mock -- AI · Financial companion · ● active]
AI:   2 signals detected overnight —
      [On-chain] Whale wallets accumulating $LINK — large wallet clusters building positions over 6h. Potential price move signal.
      [Regulatory] EU crypto regulations announced — MiCA may require licensing and KYC for some DeFi protocols. Read more ↗
User: What regulation?
AI:   MiCA applies to crypto-asset service providers but isn't designed to regulate DeFi directly. Some protocols may still need licensing and KYC depending on classification.
Market signals briefing + AI drill-down
[mock -- AI · Financial companion · ● active]
User: Buy me some AI coins
AI:   No problem — confirm below.
      [Trade widget -- Invest: $1,220.00 · Asset: AI index basket · Incl. fee: $1,244.40 · Rate: 1 ETH = 2,524 USDC · Confirm / Cancel]
User: What's my portfolio up today?
AI:   You're up $84.20 (+2.3%) today. Largest gain from the ETH limit order that triggered this morning.
Trade confirmation + portfolio summary
[mock -- Journal]
Today
  4:41   Sent to Charlie                        − $18.00
  4:30   Moved to yield pool                    − $500.00
  4:28   Received from Casey                    + $500.00
Yesterday
  8:12   AI exited — abnormal signal detected   + $240.00
  2:14   Limit order executed                   − $500.00
  11:30  AI reallocated to higher yield         + $18.40/mo est.
Activity journal — every action in plain language
// product 02 -- ai-managed fund platform

Think of a fund platform where instead of a human portfolio manager running each fund, an AI agent runs it. Users could initiate a vault with a prompt -- an investment style, a risk threshold, a market thesis -- and the agent would allocate and rebalance 24/7 within those parameters.

Early tests showed something genuinely useful: the agent could detect abnormal market signals and exit positions safely ahead of volatility -- something a human manager sleeping in a different timezone simply cannot do. Performance data showed returns comparable to or higher than equivalent human-managed funds -- at a fraction of the management cost. And because the constraints were enforced at the contract level, it was structurally more secure than a human-managed equivalent.

Where traditional fund platforms ask you to configure parameters -- risk tolerance, allocation limits, rebalancing rules -- this one asks you to describe your strategy in plain language. The AI reads intent and extracts structure, surfacing its confidence on each parameter before you commit.

The confirm step is not about filling in fields. It is about verifying that the AI understood you correctly.
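One way to model that verification step -- illustrative types only, matching the extracted / inferred / unset states shown in the vault-creation mock:

```typescript
// Each parameter the AI pulls out of the plain-language strategy carries a
// confidence state: extracted (confident), inferred (needs review), unset
// (not mentioned). Names are invented for this sketch.
type ParamState = "extracted" | "inferred" | "unset";
type ExtractedParam = { key: string; value?: string; state: ParamState };

// The confirm screen surfaces only the parameters that need a human look.
function needsReview(params: ExtractedParam[]): number {
  return params.filter((p) => p.state === "inferred").length;
}
```

The interface renders the same three states as color (confident / review / dim), so "2 params need review" is a computed fact, not copywriting.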

// launch -- film + motion

I art directed the video for Product 01. It was live action, post-futurist western. The setting sits somewhere between frontier town and near-future city. Characters with 80s silhouettes -- leather, volume, a little theatrical -- inside a world that is clean, architectural, slightly cold. The tension between those two aesthetics was intentional: familiar human warmth inside an AI-managed system.

The Product 02 video was motion graphics -- I designed the product and then built the video myself in After Effects and Cavalry. The systematic visual language was a deliberate extension of the product's logic: where the first video leaned into character, this one leaned into structure.

// design decision -- prompt-first vault creation
[mock -- vault creation]
1._vault_info → 2._strategy → 3._markets → 4._confirm
Strategy -- Describe your strategy in plain language.
VAULT_STRATEGY
// EXTRACTED_PARAMS
  preset        CONSERVATIVE
  max_position  70%
  stablecoin    preferred
  rebalance     7d
  min_apy       4%
  timelock      -- unset
  mgmt_fee      -- unset
green = extracted with confidence  ·  orange = inferred, review before confirming  ·  dim = not mentioned

Claude Sonnet  ·  2 params need review
Trust is not a feeling here. It is a state the interface makes visible.
// runtime -- positions + decision log

The runtime view. Live positions and P&L alongside the agent's full decision log -- every move plotted on the chart, every rationale one click away. The design goal was not to hide what the AI did. It was to make the AI's reasoning as navigable as the outcome.

TOTAL VALUE $12,440  ·  P&L TODAY +$284  ·  AGENT STATUS active · 7d
portfolio value — 7 days

positions
  ETH / USDC    1.2 ETH · limit $5,000    +$148  +2.1%
  BTC / USDC    0.08 BTC · limit $5,000   +$96   +1.4%
  LINK / USDC   exited · was $1,800       +$40   +2.2%
  USDC reserve  idle · yield 4.1% APY

decision log
  hold  today · 04:28      ETH — held through overnight volatility
                           trailing stop not triggered · signal above threshold
  exit  today · 02:14      LINK — exited ahead of abnormal signal
                           3 correlated wallets accumulating · pre-volatility exit
  buy   yesterday · 18:03  BTC — entered on momentum signal
                           RSI recovery · within configured max position limit
  hold  yesterday · 11:30  ETH — rebalanced to yield pool
                           idle capital above floor · auto-routed to USDC yield

decision trace -- click a log entry or chart point to inspect the agent's rationale
$ cat CONSUMER_PRODUCTS.log
03 / 04
CONSUMER PRODUCTS
Three products, eighteen months, sixty thousand users
ROLE       Founding designer & brand lead -- end-to-end design -- front-end
YEARS      2021 -- 2023
PLATFORMS  Chrome -- Web -- iOS -- Android
METRICS    60k users -- 320k visitors
// context

In October 2021 I co-founded a startup around a simple but stubborn idea: platforms should not own your information. Your content, your identity, your audience -- these should belong to you, portable across any app, visible to whoever you choose. Not because of ideology, but because the current model is structurally broken. A platform can delete your account and everything you built disappears with it.

The thesis was open information as infrastructure -- a layer beneath applications where your data lives independently of any single platform. We spent three years trying to find the right product shape for that idea, moving from a browser extension to a social identity layer to a search engine. Each product taught us something the previous one could not.

// identity extension

A browser extension that surfaced your online social identity contextually as you browsed. The core product decision was frictionless onboarding through claiming -- you already had content scattered across platforms. The extension let you link existing posts and domains to a portable identity file you owned, without creating anything new or changing how you posted. Content synced automatically in the background.

The zero-friction insight: nothing moves, nothing migrates. The file just starts pointing at what you already made.

Identity extension sync flow diagram

60k users in approximately 5 months.

// social passport

A social identity layer that connected on-chain assets with human-readable profiles. The problem was one profile, wildly different content types -- NFTs, donations, game scores, written posts. Each had a different shape.

The key interaction was the showcase management model. Inspired by iOS homescreen folders: content is clustered by type but drag-to-list and drag-to-hide let users curate exactly what appears on their profile. Listed items surface on the Vitrine. Everything else stays unlisted but accessible.

320k website visitors -- full design system built from scratch.

// interface
Manage Showcase
User Profile
// information browser

A pivot toward open information as a browsing and search experience. If your identity and content should be portable and platform-independent, then so should the way you discover information. We shipped a web app, then iOS and Android, then a search engine format -- each iteration trying to find the right container for the same underlying idea. Deprecated Feb 2024.

iOS + Android shipped -- full mobile design system.

One of the larger design investments was the action icon system -- over 30 icons covering every on-chain and social action type the feed could surface. Each icon uses a combination of shape and background color to encode meaning across two channels, so actions are scannable before reading any text. Shape families are grouped by category: social actions share one visual language, financial actions another, governance a third. The redundancy was intentional -- color alone fails in low-contrast environments, shape alone fails at small sizes. Together they hold.
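The dual-channel encoding can be expressed as data -- category pairings and colors below are illustrative (the hex values are drawn from the design-system palette shown later in this case study, but the specific action-to-color mapping is an assumption):

```typescript
// Every action icon encodes its category twice: once as a shape family,
// once as a background color. Either channel alone still identifies the
// category — shape survives small sizes, color survives quick scanning.
type Category = "social" | "finance" | "governance" | "nft";
type IconSpec = { shapeFamily: Category; bg: string };

const ICONS: Record<string, IconSpec> = {
  post: { shapeFamily: "social", bg: "#0072FF" },
  swap: { shapeFamily: "finance", bg: "#79B346" },
  vote: { shapeFamily: "governance", bg: "#EAC028" },
  mint: { shapeFamily: "nft", bg: "#ED675E" },
};

// Grouping check: all icons in a shape family share a visual language.
function sameFamily(a: string, b: string): boolean {
  return ICONS[a].shapeFamily === ICONS[b].shapeFamily;
}
```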

// before / after
blockchain explorer — raw data
etherscan.io/address/0xa3f2b91c4e8d2f1a7c3b...
tabs: Txns · Token Txns · Transactions · Internal · Token · NFT
Txn Hash       Method     Age     From           Value
0x7f3a...e2c1  Transfer   2 hrs   0x9b2d...44fa  0 ETH
0x21d8...9b3e  Mint       1 day   0xf4a1...c77d  0.08 ETH
0x88bc...3d9f  swapExact  3 days  0x3e7f...a12b  0.5 ETH
0xf4a1...c77d  castVote   2 days  0x88bc...3d9f  0 ETH
information browser — human readable
filters: All · NFTs · Finance · Governance · Posts

nova_kay published on Mirror
  Why the open web needs a new identity layer
  On portability, ownership, and what it means to exist across platforms without losing yourself in any of them.
  [M] · no gas · 382 hrs ago · Post

nova_kay minted Cool Cat #2333
  🐱 Cool Cat #2333 — CoolCatsContract · Ethereum
  [E] · − 0.08 ETH · 1 day ago · NFT

nova_kay swapped on Uniswap
  + 1,240 USDC  − 0.5 ETH
  [U] · − 0.003 ETH · 3 days ago · Finance

nova_kay voted on Proposal #44
  Treasury Reallocation — Season 12
  Voted Yes · 200 voting power · Gitcoin DAO
  [G] · no gas · 2 days ago · Governance
// design system
Colors
primary
#0072FF
text
#2C2C2E
success
#79B346
warning
#EAC028
error
#ED675E
gray
#F0F0F4
Buttons & filters
All · NFTs · Finance · Posts
Typography — Noto Sans
h1 Semibold 20pt
h2 Semibold 16pt
Body Regular 14pt · 150% line height
Caption 12pt · muted
Label 10pt · extralight 200
Network badges
E · U · G · M · O · A · L · T
Action icons
post
mint NFT
donate
swap
vote
deposit
bridge
withdraw
receive
comment
collect
stake
Feed card — anatomy
nova_kay published on Mirror
Why the open web needs a new identity layer
On portability, ownership, and what it means to exist across platforms.
M
no gas
382 hrs ago · Post
Radius
small · 4px
medium · 8px
large · 12px
Spacing
multiples of 2px
base unit: 4px
gap: 8 / 12 / 16px
Shadows
small · 0 2 10 1px
large · 0 8 64 0px
opacity 6–10%
Tags
Post · NFT · Finance · Governance
// action icon system
NFT ACTIONS
mint NFT
acquire
burn NFT
sell NFT
send NFT
collect
edit NFT
like NFT
FINANCE ACTIONS
swap
deposit
withdraw
receive
send token
bridge
stake
add liquidity
SOCIAL & GOVERNANCE
post
comment
like
share
follow
unfollow
vote
propose
CONTENT & IDENTITY
donate
launch grant
create profile
receive POAP
burn POAP
revise
link
music NFT
// what i learned about pivoting

Kill the form, keep the thesis. Every pivot killed a product but kept the idea. The design work was not wasted -- it taught us what shape the idea needed to be.

Speed is a design constraint. At a small founding team pace I had to design systems that could be built fast and iterated without full rewrites. This made me ruthless about component decisions.

60k users is a signal, not a destination. The identity extension's growth was fast and organic -- but it happened in a specific market moment that did not last. Learning to read that signal early -- and move before the trough -- was the most valuable thing I built in that period. More valuable than any of the products.

$ cat BRAND_MOTION.dir
04 / 04
BRAND & MOTION
Identity relaunch -- animation series -- global events
ROLE    Creative director -- Art director
YEARS   2023 -- 2024
EVENTS  Denver -- Singapore -- Bangkok
REACH   1k -- 2k+ per event
// feb 2024 -- identity relaunch

Directed a full brand relaunch timed as a pre-heat for mainnet deployment. The brief: move from early-startup visual language to something that could represent a protocol -- durable, systemic, trustworthy, but not corporate. Deliverables: visual system, typography, color, motion language, and launch campaign.

// events

Organized and art-directed three open-house events across two years -- founder/CEO panels, fireside chats, and product sessions.

I also designed a custom NFC wristband for the events. Tap it to another attendee's phone and it pulls up their social profiles from a companion web app. Attendees could give each other a small endorsement, which pushed them up a live leaderboard. A physical expression of the open identity thesis -- and something people actually played with.

NFC wristband flow
NFC wristband design
NFC wristband in use

Denver (2023) -- 3-day open house. 1,200+ attendees.

Singapore (2024) -- Google APAC HQ, one-day panels. 1,500+ attendees.

Bangkok (2024) -- Google Bangkok, one-day panels + launch party. 2,000+ attendees.

// animation series -- open information / open web

Creative directed a short animated series explaining the concept of open information. Worked with an animator from script to final delivery. The brief was to avoid every aesthetic trap of "blockchain explainer" videos -- no floating coins, no buzzword soup. Closer to a Kurzgesagt-style educational piece: clear concepts, deliberate pacing, visual metaphors that do actual explanatory work.

$ cat WINDSPELL.app
LIVE
WINDSPELL
Live city weather as a generative kalimba instrument
ROLE   Solo -- concept, design, build
YEAR   2025
STACK  React · Three.js · GSAP · Web Audio API
// the idea

Every city has wind. Wind has speed, direction, pressure, humidity. None of that is music -- but all of it is data with shape and rhythm.

Windspell asks: what if a city's live weather became a generative instrument? Not a visualization, not a graph. A sound you could actually sit with.

// how it works

Type a city. The app looks up its coordinates, derives a musical scale from the cultural region, and hashes the city name to a root note. Then it starts polling the weather API.

The visual center of the app is a ring of 72 tick marks -- a physical kalimba model. Nine of those ticks are active tines, each tuned to a scale degree. The ring rotates continuously. Wind speed controls how fast it spins. A needle sits fixed at the live wind direction. When a tine crosses the needle, it gets plucked.

Wind speed is tempo. Wind direction is which note plays. That is the whole mechanic.

// the generative music system

14 musical scales are mapped to geographic regions by latitude and longitude bounding boxes -- Aeolian for Nordic latitudes, Hijaz for the Middle East, Yo for Japan, Blues for the American South, Slendro for Southeast Asia. The root note comes from a djb2 hash of the city name, taken modulo 12 to pick one of the twelve chromatic semitones above C3.

Pressure, humidity, and gust data then modulate the audio continuously. High pressure brightens the filter. High humidity stretches the reverb tail. Gusty wind adds harmonic overtone content. Wind direction shifts the stereo pan field.

Chord voicing follows the region's timbre. Cold regions (Arctic, Nordic, Andean) voice an open 4th plus a sparse 6th, with long reverb. Warm regions (European, Mediterranean, Blues) voice a full triad. Bright regions (African, Japanese, Oceanic) use a clean open fifth only.

The result: Cairo and Reykjavik do not sound alike. They use different scales, different root notes, different waveforms, different reverb lengths. The same city sounds different at dawn versus midnight -- the background shader uses a real solar position algorithm, so the lighting shifts with actual sun position.
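The hash-to-root and latitude-to-tempo mappings above can be sketched directly from the formulas in this case study's spec panels (C3 = 130.81 Hz; the scale-region lookup itself is elided):

```typescript
// Root note: hash the city name to one of 12 chromatic semitones above C3.
// The reduce step mirrors the spec panel: h*31 + charcode, masked to 16 bits.
const C3 = 130.81; // Hz

function cityHash(name: string): number {
  let h = 5381;
  for (const ch of name) h = (h * 31 + ch.codePointAt(0)!) & 0xffff;
  return h;
}

function rootHz(name: string): number {
  return C3 * Math.pow(2, (cityHash(name) % 12) / 12);
}

// Tempo: equator = 120 bpm, poles = 60 bpm, linear in |latitude|.
function bpmForLatitude(lat: number): number {
  return 120 - (Math.abs(lat) / 90) * 60;
}
```

Deterministic by construction: the same city name always lands on the same root, so Cairo is always Cairo.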

01 // SIGNAL CHAIN
INPUT      WIND DATA       60s poll · OWM API
GEOMETRY   RING ROTATION   speed × 4 deg/s
COLLISION  TINE PLUCK      at wind direction
SYNTHESIS  KALIMBA NOTE    scale · timbre · reverb
OUTPUT     SPATIAL AUDIO   panned stereo field
02 // WIND → RHYTHM
RING ROTATION SPEED
deg_per_sec = wind_speed(m/s) × 4 // calm 0 m/s → 0 deg/s · gale 10 m/s → 40 deg/s
EXAMPLES
  Calm    0.5 m/s → 2°/s
  Breeze  3.5 m/s → 14°/s
  Strong  7 m/s → 28°/s
  Gale    12 m/s → 48°/s
WIND DIRECTION → WHICH NOTE
WIND DIR 320°
crossed = ((tineAngle + ringAngle) − windDir + 360) % 360 // fires when crossed < 1.2° — direction-aware, no pre-trigger
03 // WIND SPEED → DYNAMICS + ANIMATION
velocity = min(wind_speed(m/s) / 15, 1.0)
// 0 = silent · 1.0 = full amplitude · saturates at gale force
bend_degrees = 22 × (tine_height / 18) × max(velocity, 0.35)
// longer (lower-pitch) tines deflect more — mimics physical behaviour
spring_duration = 0.85 + (tine_height / 18) × 0.45 seconds
// longer tines oscillate more slowly — elastic.out(1, 0.32) easing
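The ring mechanic transcribes directly from these formulas -- a sketch, with function names invented here:

```typescript
// Wind speed sets how fast the ring spins; a tine fires when it sweeps
// past the needle parked at the live wind direction.
const PLUCK_WINDOW_DEG = 1.2; // fires when within this window, per the spec

function ringSpeedDegPerSec(windSpeedMs: number): number {
  return windSpeedMs * 4; // calm 0 m/s → 0°/s · gale 10 m/s → 40°/s
}

function tineCrossesNeedle(tineAngle: number, ringAngle: number, windDir: number): boolean {
  // normalize to 0..360 so the comparison is direction-aware
  const crossed = (((tineAngle + ringAngle - windDir) % 360) + 360) % 360;
  return crossed < PLUCK_WINDOW_DEG;
}
```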
04 // WEATHER → TIMBRE MODULATION
PRESSURE   960 → 1040 hPa (LOW → HIGH)
  Filter brightness -- low pressure → murky, dense · high pressure → bright, open
  cutoff × (0.75 + Δp × 0.5)

HUMIDITY   0 → 100% (DRY → WET)
  Reverb wet mix + tail length -- desert → short, direct · tropical → long, diffuse
  wet = 0.20 + h × 0.35 · tail × (0.6 + h × 0.8)

GUSTS      0 → 2× wind speed (STEADY → GUSTY)
  Harmonic overtone content -- turbulent air → richer texture · 2nd oscillator at 2× freq
  g = (gusts−spd)/spd · harmonic += g × 0.20

WIND DIR   0° → 360° (N → S)
  Stereo pan offset -- N wind → field shifts left · S wind → field shifts right
  pan += sin(dir_rad) × 0.15
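The modulation table transcribes to a handful of pure functions -- base cutoff and tail values are inputs here, not values taken from the product:

```typescript
// Each weather reading maps to exactly one audio parameter, per the table.
function filterCutoff(baseHz: number, pressureHpa: number): number {
  const dp = (pressureHpa - 960) / 80; // 960→1040 hPa normalized to 0..1
  return baseHz * (0.75 + dp * 0.5);   // low pressure murky, high pressure bright
}

function reverbWet(humidity01: number): number {
  return 0.2 + humidity01 * 0.35;      // desert dry → tropical diffuse
}

function reverbTail(baseSec: number, humidity01: number): number {
  return baseSec * (0.6 + humidity01 * 0.8);
}

function harmonicGain(windSpeed: number, gustSpeed: number): number {
  if (windSpeed <= 0) return 0;        // avoid divide-by-zero in calm air
  const g = Math.max(0, (gustSpeed - windSpeed) / windSpeed);
  return g * 0.2;                      // feeds a 2nd oscillator at 2× freq
}

function panOffset(windDirDeg: number): number {
  return Math.sin((windDirDeg * Math.PI) / 180) * 0.15;
}
```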
05 // CITY COORDINATES → MUSICAL SCALE
SCALE SELECTION
// 14 scales mapped to lat/lon bounding boxes
scale = selectScale(lat, lon)
// e.g. Europe lat>55° → AEOLIAN · Japan 129–146°E → YO
// root note: djb2 hash of city name → semitone 0–11
hash = name.reduce((h,c) => (h×31+c) & 0xffff, 5381)
root_hz = 130.81 × 2^(hash % 12 / 12)
// C3 = 130.81 Hz · one of 12 chromatic semitones
LATITUDE → TEMPO + TIMBRE
bpm = 120 − (|lat| / 90) × 60   // equator = 120 bpm · poles = 60 bpm
EQUATOR  120 bpm · bright
MID-LAT  90 bpm · warm
POLAR    60 bpm · cold
ALL 14 CULTURAL SCALES
MINOR PENT    ARCTIC (lat > 63°)                              sine · cold
AEOLIAN       NORDIC (lat 55–63°, lon −30–40°)                sine · cold
DORIAN        WESTERN EUROPE (lat 35–55°, lon −30–40°)        triangle · warm
PHRYGIAN      MEDITERRANEAN (lat < 35°, lon −30–40°)          triangle · warm
HIJAZ         MIDDLE EAST (lat 10–42°, lon 35–65°)            sawtooth · warm
AFRICAN PENT  SUB-SAHARAN AFRICA (lat −35–15°, lon −20–50°)   triangle · bright
BLUES         NORTH AMERICA EAST (lat > 20°, lon −100– −30°)  sawtooth · warm
MAJOR PENT    NORTH AMERICA WEST (lat > 20°, lon −170– −100°) triangle · bright
ANDEAN        ANDES / S. AMERICA (lat < −10°, lon < −65°)     sine · cold
BHAIRAV       SOUTH ASIA (lat 5–35°, lon 65–90°)              sine · cold
SLENDRO       SOUTHEAST ASIA (lat −10–25°, lon 90–130°)       sine · warm
YO            JAPAN (lat 30–46°, lon 129–146°)                sine · bright
GONG          EAST ASIA (lat 20–50°, lon 90–130°)             triangle · bright
WHOLE TONE    OCEANIA (lat −10– −50°, lon 110–180°)           triangle · bright
06 // CITY EXAMPLES
TOKYO — 35.68°N 139.69°E
  SCALE   YO (0, 2, 5, 7, 9…)
  ROOT    F#3 (hash % 12 = 6)
  TIMBRE  BRIGHT · sine · 97 bpm

REYKJAVIK — 64.13°N 21.94°W
  SCALE   MINOR PENT (0, 3, 5, 7, 10…)
  ROOT    D3 (hash % 12 = 2)
  TIMBRE  COLD · sine · 57 bpm

CAIRO — 30.04°N 31.24°E
  SCALE   HIJAZ (0, 1, 4, 5, 7…)
  ROOT    A3 (hash % 12 = 9)
  TIMBRE  WARM · sawtooth · 80 bpm

LAGOS — 6.52°N 3.38°E
  SCALE   AFRICAN PENT (0, 2, 4, 7, 9…)
  ROOT    C#3 (hash % 12 = 1)
  TIMBRE  BRIGHT · triangle · 116 bpm
07 // CHORD VOICING BY CULTURAL TIMBRE
COLD — Arctic · Nordic · Andean · South Asian
  root + 4th (+25ms, ×0.42) + sparse 6th (+55ms, ×0.22)
  Open, spacious intervals. Long reverb (2.4–3.5s).

WARM — European · Mediterranean · Blues · SE Asian
  root + 3rd (+18ms, ×0.38) + 5th (+38ms, ×0.28)
  Full triad voicing. Mid reverb (1.1–2.0s).

BRIGHT — African · Western · Oceanic · Japanese · East Asian
  root + 5th (+12ms, ×0.32)   // clean open fifth only
  Minimal, percussive. Short reverb (0.4–0.7s).
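The voicing tables read naturally as data -- a sketch, where the semitone values (perfect 4th = 5, major 6th = 9, major 3rd = 4, perfect 5th = 7) are my assumption about which intervals the labels mean:

```typescript
// Each timbre class layers harmony notes over the plucked root at an
// interval, a delay, and a gain — straight from the voicing tables.
type Harmony = { semitones: number; delayMs: number; gain: number };

const VOICINGS: Record<"cold" | "warm" | "bright", Harmony[]> = {
  cold: [
    { semitones: 5, delayMs: 25, gain: 0.42 }, // open 4th
    { semitones: 9, delayMs: 55, gain: 0.22 }, // sparse 6th
  ],
  warm: [
    { semitones: 4, delayMs: 18, gain: 0.38 }, // 3rd
    { semitones: 7, delayMs: 38, gain: 0.28 }, // 5th
  ],
  bright: [
    { semitones: 7, delayMs: 12, gain: 0.32 }, // clean open fifth only
  ],
};

function voiceChord(rootHz: number, timbre: keyof typeof VOICINGS) {
  return VOICINGS[timbre].map((h) => ({
    freq: rootHz * Math.pow(2, h.semitones / 12),
    delayMs: h.delayMs,
    gain: h.gain,
  }));
}
```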
// design decisions

The resonator form was not the obvious first choice. Five directions were explored -- a breathing circle, concentric orrery rings, radial petals, a Lissajous curve, and soft diffused arcs. The concentric ring model survived because it was the only one that directly modeled the physical mechanics: a ring sweeping tines past a fixed needle is exactly how a kalimba works. The visual and the audio became one system.

Webcam and cloth simulation were scoped early and removed. A fully procedural GLSL shader -- driven by cultural region hue, live weather uniforms, and solar position -- is faster, more reliable, and creates a more distinct visual identity than a background feed.

WINDSPELL // RESONATOR DESIGN OPTIONS

01 // BREATHING CIRCLE
ring pulses with wind · ripple on pluck · minimal
02 // RADIAL PETALS
9 petals by pitch · glow on pluck · mandala
03 // LISSAJOUS CURVE
wind-driven parametric curve · continuously morphing
04 // DIFFUSED ARCS
soft aurora bands · bloom on pluck · pure light
// what i built

Three.js for the fullscreen GLSL shader. GSAP for the Resonator ring animation and tine bend physics. Web Audio API for the synthesis chain -- oscillators, bandpass noise, convolver reverb, stereo panning, biquad filter. OpenWeatherMap for live wind data, polled every 60 seconds.

The audio chain runs a continuous drone at half the root frequency and bandpass-filtered wind noise that scales with wind speed. Per-pluck notes layer on top: the main oscillator, a harmony note from the chord voicing, and an overtone oscillator that scales with gustiness. Everything routes through a master gain with a wet/dry reverb split that rebuilds when humidity shifts significantly.

React + TypeScriptThree.jsGSAPWeb Audio APIGLSLOpenWeatherMap
windspell.zui.ooo
$ cat PAGURO.wip
WIP
PAGURO
A small creature that lives on your token spend
ROLE      Solo -- concept, design, prototyping
YEAR      2025
PLATFORM  Terminal / macOS widget
STATUS    EARLY WIP
// the idea

Token costs are invisible. You send a message, something happens, compute is consumed -- and you never feel it. The abstraction is total. For most users this is fine. But for people building on top of LLMs, or just thinking carefully about their AI usage, the invisibility creates a strange disconnect.

Paguro is a small terminal app and macOS widget that converts your token consumption into an in-world currency -- "pp" (paguro points). Spend tokens, earn pp. Use pp to feed your creature, buy it a new shell, keep it alive. Abstract API costs become something you can see, feel, and even care about a little.
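Loudly hypothetical sketch of the conversion -- the rate and weighting below are invented for illustration; the real mechanic is not public:

```typescript
// Invented conversion: token spend → "pp". Output tokens cost more than
// input tokens, so (in this sketch) they earn proportionally more pp.
const PP_PER_1K_TOKENS = 10; // invented rate

function tokensToPp(inputTokens: number, outputTokens: number): number {
  return Math.floor(((inputTokens + outputTokens * 3) / 1000) * PP_PER_1K_TOKENS);
}
```

The interesting design work is not the arithmetic -- it is choosing a rate that makes spend feel legible without turning every message into a guilt trip.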

// why it is interesting to design

Most AI product design is about removing friction and hiding cost. Paguro goes the other direction -- it makes cost visible, but translates it into something playful rather than stressful. The design question is: how much visibility is useful before it becomes anxiety-inducing? Where is the line between awareness and guilt?

There is also something worth exploring in the creature format specifically. A Tamagotchi-style companion that depends on your AI usage creates a relationship with your tooling that is genuinely different from a dashboard or a usage meter. It is not extractive -- you are not being warned or penalized. You are just... tending something.

// current state

Very early. The concept is clear, the mechanic is defined, early prototyping is underway. No public release yet. Included here because the design thinking is real even if the product is not finished.

$ cat about.md

I think the most interesting design problem right now is the one nobody has a pattern for yet: what does an interface feel like when the AI is the actor?

Not "here is a chatbox." Not "here is an AI feature inside your existing product." But: the AI just did something on your behalf. It made a call. Now what? How do you design for trust, for legibility, for the user's sense of agency -- when the decision was not theirs?

I have been living inside that question at a startup where AI agents managed real money on behalf of users. No established patterns. You figure it out or the product fails.

I have a stats background (UMich) and a data visualization degree (Parsons). I code what I design. I built Melo -- a vocabulary tool on the App Store that connects to Claude -- because I was learning French and nothing handled conjugation and gender well. That is roughly how I work: see the problem, build the thing.

hi@zui.ooo
$ cat resume.txt
// experience
Founding Designer · Product & Brand Lead
Early-stage AI Startup (stealth) · 2021.10 -- PRESENT

- Founding designer across 3 consumer products end-to-end: Chrome ext -- web app -- iOS + Android
- 60k users in 5mo (identity ext.) -- 320k website visitors -- full design system ownership
- Designed NFC networking hardware + companion web app for 3 events across Denver, Singapore, Bangkok (4.7k+ attendees)
- Creative directed brand relaunch + animated explainer series; art directed launch video (AE + Cavalry)
- 2025 spin-out: product design lead for AI agent investment platform and iOS trading app

// education
M.S. Data Visualization · Parsons School of Design, The New School · 2018 -- 2020
B.S. Statistics · University of Michigan · 2014 -- 2018
// skills

Design: Figma -- design systems -- interaction design -- motion direction -- motion graphics (AE, Cavalry) -- brand identity
Build: React -- HTML/CSS/JS -- Swift -- MCP server development -- Blender
AI: Claude API -- MCP integration -- AI agent UX
Other: developer relations -- event production -- creative direction
Languages: Mandarin -- intermediate French & German -- conversational Japanese

zui@portfolio:~$