2024 – Current

Agentic Networking

Redefined networking intelligence — led design for the industry’s first multi-agent infrastructure platform that learns, adapts, and collaborates with its operators.

Co-invented "AI Agency for Intelligence Network-Platform Management" (USPTO pending), formalizing safe agent orchestration and oversight.

Background + Early Headwinds

Extreme Networks was built on decades of hardware excellence. As the company pivoted toward cloud and AI, I was brought in to help transform it into a software-driven organization — shifting design from aesthetics to strategy and embedding user-centric rhythms across product teams.

The AI Shift: From Research to Product

Until 2024, Extreme’s AI efforts lived quietly in the lab — a small team of engineers and researchers exploring models in isolation. Then the landscape changed overnight: AI funding surged, expectations skyrocketed, and customers demanded answers.


I was asked to lead the design and productization of Extreme’s AI portfolio — to translate research into applied intelligence and bring it to market.

Timeline

The Spark (Q4 2024)

AI’s value is unlocked through design, not just data.
We formed a tiger team with the CTO, the VP of Product, an AI researcher, and Design to explore how intelligence should actually live inside the platform.


Through workshops, whiteboards, and rapid prototypes, we realized our challenge wasn’t integrating AI; it was defining a new relationship between humans and intelligent systems.

Workshops

Rapid Prototyping

Conferences & Demos

AI Labs
This is what we had

Our AI lab had started as a modest experiment: mostly a lightweight imitation of models like ChatGPT or Gemini. We knew we needed to go further: to apply AI not as a generic chatbot, but as an adaptive system embedded directly into the network, learning from real operational context and user behavior.

Some of the challenges we faced as a team

Adding intelligence

What is the least invasive way to integrate explicit AI capabilities into an already built, about-to-ship platform?

GenAI Primitives

How do traditional UX building blocks evolve when AI becomes a co-contributor?

Agent Orchestration

How do we navigate agency, user expectations, discovery, and recoverability in a multi-agent system? We had almost no precedents to draw from!

Workshops

Insight and Takeaways

These sessions revealed our real challenge — not integrating AI, but redefining how intelligence should live within the product.

Two Core Experiences

Two core experiences emerged from this work: our AI Conversational UX/UI and our GenAI Canvas.

AI Conversational UX/UI

(Chat Invoked)

Expanded Chat


GenAI Canvas

The Breakthrough (Q4 2024)


To scale this intelligence layer, we had to design the architecture — a world where agents could actually live, communicate, and evolve. We were operating in uncharted territory; few precedents existed for applied AI in networking, and everyone had a different view of how the agent architecture and taxonomy should be instantiated.

Once we defined the agent architecture, the next step was to make it tangible.
AI Exchange was our way of giving structure a face — a place where users could actually see, manage, and understand the intelligence operating on their behalf.

Agent Mental Model

This mental model visualizes how users engage with AI-driven capabilities within the Agent Exchange ecosystem across three key phases: Explore, Configure, and Monitor.


  • Explore: Users discover and evaluate new agent capabilities, focusing on questions like What can it do for me? and Is it secure?

  • Configure: Users begin tailoring the agent experience, seeking clarity on control, cost, and activation.

  • Monitor: Users assess ongoing value and impact, asking Is this helping me? and Am I getting my money’s worth?


The circular flow reflects a continuous feedback loop—exploration informs configuration, monitoring reveals new opportunities, and the cycle repeats—driving trust, adoption, and perceived value across the agent lifecycle.
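The cycle above can be expressed as a tiny state model. This is an illustrative sketch only; the `Phase` enum and `advance` helper are hypothetical names, not product code:

```python
from enum import Enum

class Phase(Enum):
    """Phases of the agent lifecycle mental model."""
    EXPLORE = "explore"      # What can it do for me? Is it secure?
    CONFIGURE = "configure"  # Clarity on control, cost, and activation
    MONITOR = "monitor"      # Is this helping me? Am I getting my money's worth?

# The circular flow: each phase feeds the next, and monitoring
# loops back into exploration as new opportunities surface.
NEXT_PHASE = {
    Phase.EXPLORE: Phase.CONFIGURE,
    Phase.CONFIGURE: Phase.MONITOR,
    Phase.MONITOR: Phase.EXPLORE,
}

def advance(phase: Phase) -> Phase:
    """Move to the next phase in the continuous feedback loop."""
    return NEXT_PHASE[phase]
```

Three calls to `advance` starting from `Phase.EXPLORE` return to `Phase.EXPLORE`, mirroring the closed loop in the diagram.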


The Expansion (Q2 2025)

In Q2, design expanded in two directions at once — horizontally across the organization and vertically into strategy.

Horizontally, we codified the new language of intelligence — principles, primitives, patterns — embedding them in our design system so every team could build coherently.


Vertically, that same language started shaping business strategy.

A new language for intelligence

GenAI Primitives
A taxonomy of new design affordances

Chat with Everything

Every object becomes conversational — data, UI, docs, devices.

Format Translation

Converting between modalities and representations — text → table → diagram → video → code.

Multi-Player

Shared environments where humans and agents act concurrently.

Semantic Resizing

Adjusting meaning dynamically — shorter, deeper, simpler, more formal.

Review & Supervision

Humans shift from maker to editor, curator, supervisor.

Attention Allocation

Agents operate asynchronously; humans orchestrate priorities and focus.
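Once codified in a design system, a taxonomy like this becomes machine-readable data that tooling can query. A minimal sketch in Python; the `Primitive` dataclass and `lookup` helper are illustrative, not the actual design-system schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Primitive:
    """One entry in the GenAI primitives taxonomy."""
    name: str
    affordance: str  # what the primitive lets users and agents do

PRIMITIVES = [
    Primitive("Chat with Everything",
              "Every object becomes conversational: data, UI, docs, devices."),
    Primitive("Format Translation",
              "Convert between modalities: text, table, diagram, video, code."),
    Primitive("Multi-Player",
              "Shared environments where humans and agents act concurrently."),
    Primitive("Semantic Resizing",
              "Adjust meaning dynamically: shorter, deeper, simpler, more formal."),
    Primitive("Review & Supervision",
              "Humans shift from maker to editor, curator, supervisor."),
    Primitive("Attention Allocation",
              "Agents run asynchronously; humans orchestrate priority and focus."),
]

def lookup(name: str) -> Primitive:
    """Find a primitive by name (raises StopIteration if absent)."""
    return next(p for p in PRIMITIVES if p.name == name)
```

Freezing the dataclass keeps each taxonomy entry immutable, so every team consumes the same definitions.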

The Alpha Launch (Q3 2025)

By Q3, everything we’d been designing — the architecture, the primitives, the language — finally came together. We called this phase the Alpha Launch. Not because the product was finished, but because the system was alive and working.


It was the first moment we could feel intelligence in the experience.

Our design language, our AI pattern library, and the underlying architecture were now working in sync — design had effectively codified intelligence across the platform.


This wasn’t a pixel launch. It was a systems launch — proof that design could serve as the connective tissue between research, engineering, and strategy.

Outcomes

  1. Framework adopted across the organization

  2. Established the foundation for Extreme’s AI product strategy and the GTM roadmap

  3. Filed a provisional patent and defined a new design language for agentic systems

  4. Influenced next-gen networking platform architecture

What I learned

Learned to design with uncertainty

You can’t design for AI the same way you design for deterministic systems.

Lead by design instead of architecture

Early on, system architecture dictated the experience; over time that relationship inverted, and design began to lead.

Prototyping leading to a design language

Prototypes are only as powerful as the principles they encode. I developed a method for converting exploration into a reusable pattern language for AI interactions (principles → primitives → patterns → components).

From Dialogic to Co-Adaptive to Collaborative

AI experiences are dialogic, co-adaptive, and ultimately collaborative — systems that learn alongside us, not just from us.
