Lexicon: Open Source Reference Architecture for Fullstory Integrations

Lexicon is an open-source project and implementation guide developed by Fullstory. It serves as a reference architecture for organizations that need to build custom, highly specific data pipelines.

Looking for the official Fullstory MCP? If you want to connect Fullstory data directly to AI agents using the managed Model Context Protocol (MCP) server, see Fullstory MCP.

What is Lexicon?

Unlike the official Fullstory MCP, which is a hosted, production-ready service, Lexicon provides demonstration implementation patterns and open source code examples. It is designed to help your engineering team understand how to build and deploy two specific types of infrastructure:

Serverless Middleware for Webhook Routing

Lexicon demonstrates how to build a multi-cloud, serverless middleware layer designed to:

  • Receive: Ingest incoming webhook payloads (primarily from Fullstory Streams, but adaptable for any source).
  • Transform: Validate, clean, and reformat data in flight.
  • Enrich: Join Fullstory behavioral data with other internal data sources.
  • Route: Distribute the final payload to one or many destinations (e.g., Slack, JIRA, data warehouses like BigQuery and Snowflake).

This pattern is ideal for teams that require an intermediary to orchestrate complex data flows between Fullstory and the rest of their tech stack.
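The four steps above can be sketched as a small set of composable functions. This is an illustrative outline only, not code from the Lexicon repository; all function and field names (handleWebhook, transformEvent, and so on) and the payload shape are assumptions.

```javascript
// Transform: normalize an incoming webhook payload (shape is illustrative).
function transformEvent(payload) {
  return {
    eventName: payload.name,
    userId: payload.user?.id,
    occurredAt: new Date(payload.timestamp).toISOString(),
  };
}

// Enrich: join behavioral data with an internal source (stubbed as a lookup).
function enrichEvent(event, accountsById) {
  return { ...event, account: accountsById[event.userId] ?? null };
}

// Route: fan the final payload out to one or many destination senders
// (e.g. a Slack poster, a JIRA client, a warehouse loader).
async function routeEvent(event, destinations) {
  await Promise.all(destinations.map((send) => send(event)));
}

// A serverless entry point (Cloud Function, Lambda, etc.) wires it together.
async function handleWebhook(payload, { accountsById, destinations }) {
  const event = enrichEvent(transformEvent(payload), accountsById);
  await routeEvent(event, destinations);
  return event;
}
```

Injecting the destinations as plain functions keeps each step independently testable and makes the routing layer cloud-agnostic.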

Custom MCP Tooling (Self-Managed Implementation)

For organizations building their own private MCP servers to support internal AI agents, Lexicon provides a blueprint for turning Fullstory's Global Public APIs into MCP tools. This allows you to:

  • Create bespoke AI tools that interact with specific Fullstory endpoints.
  • Integrate Fullstory data into existing internal AI infrastructure.
  • Control how Fullstory data is exposed to your organization's custom LLM implementations.
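As a rough illustration of the blueprint, a Fullstory endpoint can be wrapped as an MCP-style tool descriptor: a name, a description, a JSON Schema for the input, and a handler the server invokes when an agent calls the tool. The descriptor shape below follows the general MCP tool convention; the tool name, handler, and endpoint path are assumptions for this sketch, not the actual contents of fullstory-tools.js.

```javascript
// Build an MCP-style tool that fetches a Fullstory user by id.
// fetchImpl is injected so the tool can be tested without network access.
function makeFullstoryTool(fetchImpl, apiKey) {
  return {
    name: 'get_user',
    description: 'Fetch a Fullstory user profile by id',
    inputSchema: {
      type: 'object',
      properties: { id: { type: 'string' } },
      required: ['id'],
    },
    // Invoked by the MCP server when an agent calls the tool.
    async handler({ id }) {
      const res = await fetchImpl(
        `https://api.fullstory.com/v2/users/${encodeURIComponent(id)}`,
        { headers: { Authorization: `Basic ${apiKey}` } }
      );
      return res.json();
    },
  };
}
```

Registering a set of such descriptors with your MCP server is what exposes Fullstory data to your internal agents, and the descriptor boundary is where you can enforce which endpoints and fields are reachable.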

Fullstory Integration

Lexicon includes reference implementations for all Fullstory v1 and v2 Server APIs, made available in Fullstory.js. This includes, for example, the ability to create custom events, retrieve user details, generate session summaries, and more.
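For a sense of what such a helper looks like, here is a hedged sketch of creating a custom event through the v2 server API. The function name and the exact request-body schema are assumptions for illustration; consult the Fullstory API reference and the implementations in Fullstory.js for the authoritative shapes.

```javascript
// Sketch: post a custom event to the Fullstory v2 events endpoint.
// fetchImpl is injected so the helper can be exercised without network access.
function createCustomEvent(fetchImpl, apiKey, { uid, name, properties }) {
  return fetchImpl('https://api.fullstory.com/v2/events', {
    method: 'POST',
    headers: {
      Authorization: `Basic ${apiKey}`, // Fullstory API keys use the Basic scheme
      'Content-Type': 'application/json',
    },
    // Body shape is an assumption; see the v2 API docs for the exact schema.
    body: JSON.stringify({ user: { uid }, event: { name, properties } }),
  });
}
```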

Lexicon's webhook processing and routing patterns are well suited for handling Fullstory Streams.

Lexicon also includes a self-managed MCP server for Fullstory APIs, enabling AI agents to interact with Fullstory data during software development. The MCP server implementation can be found in the MCP folder and the Fullstory tools in fullstory-tools.js.

Key Considerations

  • Reference only: Lexicon is provided as a demonstration of how integrations can be built. It is not designed to be run as unedited production code. Organizations are responsible for the security, scaling, and maintenance of their Lexicon deployment.
  • Open source: The code is available for you to fork, adapt, and extend based on your unique infrastructure needs.
  • Managed vs. custom: If your goal is to enable AI agents to access Fullstory data without building your own middleware, we recommend using the official Fullstory MCP.

Getting Started

To explore the implementation patterns and access the repository, visit the Fullstory Lexicon GitHub project.

License

Lexicon is open source and available under the MIT License. See the LICENSE file in the Lexicon repository for details.

