Tejas Parthasarathi Sudarshan

Hello! I'm Tejas and I build things!

Right now I'm building FanDesk. The core idea: every organization's knowledge (chat, tasks, docs, email, calendar, contacts) should live in one unified memory. Not stitched together with integrations; natively connected. We built a full workspace on top of that memory (it replaces Slack, Notion, Linear, Gmail, and Calendar), but the workspace isn't the point. The memory is. When all context lives in one place, AI can actually do real work: it can reason across everything, not just what's in one app. That's what we're building toward: AI employees that operate with full organizational context. FanDesk exposes 400+ MCP tools, so you can run it from Claude Code or any AI app you already use.

I built FanDesk with a modified version of Skillgraph, an open-source agentic AI framework I wrote. Instead of handing agents a flat pile of tool functions, Skillgraph organizes them into skills: higher-level units that handle tasks with their own logic and workflows. Skills orchestrate multiple tools, manage state, and can act as subagents when needed.
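To make the skill idea concrete, here's a toy sketch in Python. This is not Skillgraph's actual API; every name here (`Skill`, `TriageSkill`, the tool names) is invented for illustration. The point is the shape: a skill owns its tools, its state, and its workflow, instead of exposing each tool to the agent individually.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Skill:
    # A skill bundles tools, private state, and its own workflow logic.
    # Hypothetical interface, for illustration only.
    name: str
    tools: dict[str, Callable[..., str]] = field(default_factory=dict)
    state: dict = field(default_factory=dict)

    def run(self, task: str) -> str:
        raise NotImplementedError

class TriageSkill(Skill):
    # Example skill: files an issue, remembers it, then notifies.
    # A real skill might let an LLM drive this loop instead of
    # hard-coding the sequence.
    def run(self, task: str) -> str:
        issue = self.tools["create_issue"](task)
        self.state["last_issue"] = issue
        return self.tools["notify"](f"Filed {issue}")

# Usage: the agent invokes one skill, not three separate tools.
skill = TriageSkill(
    name="triage",
    tools={
        "create_issue": lambda t: f"ISSUE-1: {t}",
        "notify": lambda m: m,
    },
)
result = skill.run("fix login bug")
```

Because the skill carries its own state, it can also be handed off as a subagent: the caller only sees `run`, not the tool calls underneath.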

Before FanDesk, I built Fanpit, a community management platform for brands. CRM, loyalty, stores, memberships, events, all in one place. It started as a brand-building tool and pivoted when we saw community was the strongest growth channel.

On the research side, I'm working on Neural DNA (NDNA), a compact genome for growing network architectures. The latest result: 354 parameters wire GPT-2 Small (paper). A tiny genome controls 35.4 million connections at a 99,970:1 compression ratio, and the resulting model beats GPT-2 on three benchmarks (WikiText-103, Penn Treebank, and LAMBADA perplexity) while permanently disabling one-third of its masked connections. The genome discovers that deep layers need full connectivity while shallow layers can be aggressively pruned, and it even changes its mind mid-training, resurrecting a layer it had killed 25,000 iterations earlier. The interactive visualization lets you watch the topology evolve in real time. The first paper (small-scale experiments) showed genomes of 226 to 374 parameters controlling up to 2.2M connections across MLPs, CNNs, transformers, and video models. The core idea: neural networks should grow their topology like biological brains do, defaulting to disconnected, with metabolic cost pressure forcing selectivity.
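The genome-to-topology idea can be sketched in a few lines. This is a toy, not NDNA's actual parameterization: here each "gene" just sets one layer's connection density, and connections default to off unless the gene grows them. It shows how a handful of parameters can control a mask over thousands of connections.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "genome": one gate logit per layer (hypothetical scheme).
# 4 parameters will control 4 * 64 * 64 = 16,384 connections.
n_layers, fan_in, fan_out = 4, 64, 64
genome = rng.normal(size=n_layers)

def grow_masks(genome, fan_in, fan_out):
    # Each gene maps through a sigmoid to a keep-density for its layer;
    # a connection exists only if its random score falls under that
    # density. Default-disconnected: a very negative gene prunes almost
    # everything, mimicking metabolic cost pressure.
    masks = []
    for gene in genome:
        keep_density = 1.0 / (1.0 + np.exp(-gene))
        scores = rng.random((fan_in, fan_out))
        masks.append(scores < keep_density)
    return masks

masks = grow_masks(genome, fan_in, fan_out)
controlled = n_layers * fan_in * fan_out
compression_ratio = controlled / genome.size  # 4096:1 in this toy
```

Scale the same idea up and you get the paper's numbers: a genome of a few hundred parameters steering tens of millions of connections, with training pressure deciding which layers keep full connectivity.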

I came from accounting. Chartered Accountancy from ICAI, audit at KPMG. Then I taught myself to build. No CS degree, no bootcamp. Just a strong opinion that the tools we use to work are broken, and the stubbornness to try fixing them.

When I'm not building, I'm on random side quests. Music, movies, markets, and technology are things I have strong opinions about and am always happy to argue over.


Sidequests

  • Skillgraph - Open-source agentic AI framework. Skill-based architecture for LLM agents
  • MIDICode - Turn your MIDI keyboard into a macOS system controller
  • Let's Find Sanity - Anonymous journaling platform for founders and builders
  • Kodart - Transforming code into visual art
  • TapMeow - Tap your MacBook, get a meow

If you'd like to get in touch, I'm always happy to hear from curious people. Best way to reach me is by email: tejas@fandesk.ai