
Indexing NFT Transfers on Moonbeam with Subsquid

by Massimo Luraschi


Subsquid is a data network that allows rapid and cost-efficient retrieval of blockchain data from 100+ chains using Subsquid’s decentralized data lake and open-source SDK.

The SDK offers a highly customizable Extract-Transform-Load-Query stack and indexing speeds of up to and beyond 50,000 blocks per second when indexing events and transactions.

Subsquid has native and full support for both the Ethereum Virtual Machine and Substrate data. This allows developers to extract on-chain data from any of the Moonbeam networks and process EVM logs as well as Substrate entities (events, extrinsics and storage items) in one single project and serve the resulting data with one single GraphQL endpoint. With Subsquid, filtering by EVM topic, contract address, and block range are all possible.

This guide will explain how to create a Subsquid project (also known as a "squid") from a template (indexing Moonsama transfers on Moonriver), and change it to index ERC-721 token transfers on the Moonbeam network. As such, you'll be looking at the Transfer EVM event topics. This guide can be adapted for Moonbase Alpha as well.

The information presented herein is for informational purposes only and has been provided by third parties. Moonbeam does not endorse any project listed and described on the Moonbeam docs website.

Checking Prerequisites

For a Squid project to be able to run, you need to have the following installed:

  • Node.js (version 16 or later)
  • Docker
  • Squid CLI (the sqd command)

This tutorial uses custom scripts defined in commands.json. The scripts are automatically picked up as sqd sub-commands.

Scaffold a Project From a Template

We will start with the frontier-evm squid template available through sqd init. It is built to index EVM smart contracts deployed on Moonriver, but it is also capable of indexing Substrate events. To retrieve the template and install the dependencies, run:

sqd init moonbeam-tutorial --template frontier-evm
cd moonbeam-tutorial
npm ci

Define Entity Schema

Next, we ensure that the data schema of the squid defines entities that we would like to track. We are interested in:

  • Token transfers
  • Ownership of tokens
  • Contracts and their minted tokens

Luckily, the EVM template already contains a schema file that defines the exact entities we need:

type Token @entity {
  id: ID!
  owner: Owner
  uri: String
  transfers: [Transfer!]! @derivedFrom(field: "token")
  contract: Contract
}

type Owner @entity {
  id: ID!
  ownedTokens: [Token!]! @derivedFrom(field: "owner")
  balance: BigInt
}

type Contract @entity {
  id: ID!
  name: String
  symbol: String
  totalSupply: BigInt
  mintedTokens: [Token!]! @derivedFrom(field: "contract")
}

type Transfer @entity {
  id: ID!
  token: Token!
  from: Owner
  to: Owner
  timestamp: BigInt!
  block: Int!
  transactionHash: String!
}
It's worth noting a couple of things in this schema definition:

  • @entity - signals that this type will be translated into an ORM model that is going to be persisted in the database
  • @derivedFrom - signals that the field will not be persisted in the database. Instead, it will be derived from the entity relations
  • type references (e.g. from: Owner) - when used on entity types, they establish a relation between two entities

TypeScript entity classes have to be regenerated whenever the schema is changed, and to do that we use the squid-typeorm-codegen tool. The pre-packaged commands.json already comes with a codegen shortcut, so we can invoke it with sqd:

sqd codegen

The (re)generated entity classes can then be browsed at src/model/generated.
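To give an idea of what codegen produces: stripped of their TypeORM decorators (@Entity, @PrimaryColumn, @ManyToOne and so on), the generated classes are plain models constructed from partial objects. The sketch below is a simplified illustration of the Token and Owner models, not the exact generated files:

```typescript
// Simplified sketch of generated entity classes (e.g. src/model/generated/token.model.ts).
// The real files decorate these fields with TypeORM decorators
// (@Entity, @PrimaryColumn, @ManyToOne, ...), omitted here for brevity.
class Owner {
  id!: string;
  balance?: bigint;
  constructor(props?: Partial<Owner>) {
    Object.assign(this, props);
  }
}

class Token {
  id!: string;   // ID! in schema.graphql becomes a required string id
  owner?: Owner; // entity reference -> relation to the Owner table
  uri?: string;
  constructor(props?: Partial<Token>) {
    Object.assign(this, props);
  }
}

// Entities are constructed from partial objects, mirroring the generated API
const owner = new Owner({ id: '0xabc', balance: BigInt(1) });
const token = new Token({ id: '42', owner, uri: 'ipfs://QmHash' });
console.log(token.id, token.owner?.id); // 42 0xabc
```

Derived fields such as transfers and ownedTokens do not appear as stored columns; they are resolved from the relations at query time.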

ABI Definition and Wrapper

Subsquid maintains tools for automated generation of TypeScript classes for handling Substrate data sources (events, extrinsics, storage items). Possible runtime upgrades are automatically detected and accounted for.

Similar functionality is available for EVM indexing through the squid-evm-typegen tool. It generates TypeScript modules for handling EVM logs and transactions based on a JSON ABI of the contract.

For our squid we will need such a module for the ERC-721-compliant part of the contracts' interfaces. Once again, the template repository already includes it, but it is still important to explain what needs to be done in case one wants to index a different type of contract.

The procedure uses a sqd script from the template that invokes squid-evm-typegen to generate TypeScript facades for the JSON ABIs stored in the abi folder. Place any ABIs you require for interfacing with your contracts there and run:

sqd typegen

The results will be stored at src/abi. One module will be generated for each ABI file, and it will include constants useful for filtering and functions for decoding EVM events and functions defined in the ABI.
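For example, the generated erc721 module exposes the Transfer event topic for filtering and a decode helper. To illustrate what such decoding does under the hood: in an ERC-721 Transfer log all three parameters (from, to, tokenId) are indexed, so each arrives as a 32-byte topic. The hand-rolled decoder below is a sketch for illustration only, with made-up addresses; in the squid you would use the generated module instead:

```typescript
// Illustrative hand-rolled decoder for an ERC-721 Transfer log.
// topics[0] is the event signature hash; topics[1..3] carry the indexed
// parameters `from`, `to` and `tokenId`, each left-padded to 32 bytes.
type TransferEvent = { from: string; to: string; tokenId: bigint };

function decodeTransfer(topics: string[]): TransferEvent {
  const toAddress = (t: string) => '0x' + t.slice(-40); // keep the last 20 bytes
  return {
    from: toAddress(topics[1]),
    to: toAddress(topics[2]),
    tokenId: BigInt(topics[3]),
  };
}

// Example log topics for a mint of token #7 (addresses are made up)
const topics = [
  // keccak256('Transfer(address,address,uint256)')
  '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef',
  '0x0000000000000000000000000000000000000000000000000000000000000000',
  '0x0000000000000000000000005274a86d39fd6db8e73d0ab6d7d5419c1bf593f8',
  '0x0000000000000000000000000000000000000000000000000000000000000007',
];

const ev = decodeTransfer(topics);
console.log(ev.to, ev.tokenId.toString());
```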

Define and Bind Event Handler(s)

Subsquid SDK provides users with the SubstrateBatchProcessor class. Its instances connect to chain-specific Subsquid archives to get chain data and apply custom transformations. The indexing begins at the starting block and keeps up with new blocks after reaching the tip.

The SubstrateBatchProcessor exposes methods to "subscribe" to specific data such as Substrate events, extrinsics, storage items or, for EVM, logs and transactions. The actual data processing is then started by calling the .run() function. This will start generating requests to the Archive for batches of data specified in the configuration, and will trigger the callback function, or batch handler (passed to .run() as second argument) every time a batch is returned by the Archive.

It is in this callback function that all the mapping logic is expressed. This is where chain data decoding should be implemented, and where the code to save processed data on the database should be defined.
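Conceptually, the loop driven by .run() can be sketched as follows. The types and the run function here are made up for illustration and greatly simplified compared to the SDK's actual implementation:

```typescript
// Schematic sketch of the batch-processing loop (not the real SDK code).
type Block = { height: number; items: string[] };
type Ctx = { blocks: Block[] };

function run(batches: Block[][], handler: (ctx: Ctx) => void): void {
  // one handler invocation per batch of blocks returned by the Archive
  for (const blocks of batches) {
    handler({ blocks });
  }
}

// Usage: count items across two batches
let seen = 0;
run(
  [
    [{ height: 1, items: ['EVM.Log'] }, { height: 2, items: [] }],
    [{ height: 3, items: ['EVM.Log', 'EVM.Log'] }],
  ],
  (ctx) => {
    for (const block of ctx.blocks) seen += block.items.length;
  }
);
console.log(seen); // 3
```

The important point is that the handler receives whole batches, not single events, which is what makes it efficient to deduplicate lookups and save entities in bulk.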

Managing the EVM contract

Before we begin defining the mapping logic of the squid, we are going to rewrite the src/contracts.ts utility module for managing the involved EVM contracts. It will export:

  • The address of the Gromlins contract
  • A function that will create and save an instance of the Contract entity to the database
  • A function that will return a Contract instance, either the already existing one or a newly created entity. The first time it is called, it checks whether a Contract already exists in the database; if not, it invokes the first function and caches the result, so subsequent calls return the cached version

Here are the full file contents:

// src/contract.ts
import { Contract as ContractAPI } from './abi/erc721';
import { BigNumber } from 'ethers';
import { Context } from './processor';
import { Contract } from './model';

// Address of the Gromlins ERC-721 contract on Moonbeam (lowercased)
export const contractAddress = '0x...'; // TODO: insert the contract address

export async function createContractEntity(ctx: Context): Promise<Contract> {
  const lastBlock = ctx.blocks[ctx.blocks.length - 1].header;
  const contractAPI = new ContractAPI(
    { ...ctx, block: lastBlock },
    contractAddress
  );
  let name = '',
    symbol = '',
    totalSupply = BigNumber.from(0);
  ctx.log.info('Creating new Contract model instance');
  try {
    name = await contractAPI.name();
    symbol = await contractAPI.symbol();
    totalSupply = await contractAPI.totalSupply();
  } catch (error) {
    ctx.log.warn(
      `[API] Error while fetching Contract metadata for address ${contractAddress}`
    );
    if (error instanceof Error) {
      ctx.log.warn(`${error.message}`);
    }
  }
  return new Contract({
    id: contractAddress,
    name: name,
    symbol: symbol,
    totalSupply: totalSupply.toBigInt(),
  });
}

let contractEntity: Contract | undefined;

export async function getContractEntity(ctx: Context): Promise<Contract> {
  if (contractEntity == null) {
    contractEntity = await ctx.store.get(Contract, contractAddress);
    if (contractEntity == null) {
      contractEntity = await createContractEntity(ctx);
      await ctx.store.insert(contractEntity);
    }
  }
  return contractEntity;
}
You might notice a warning that the Context variable hasn't been exported, but don't worry, as we'll export it from the src/processor.ts file in the next section.


The createContractEntity function is accessing the state of the contract via a chain RPC endpoint. This is slowing down the indexing a little, but this data is only available this way. You'll find more information on accessing state in the dedicated section of our docs.

Configure Processor and Attach Handler

The src/processor.ts file is where squids instantiate the processor (a SubstrateBatchProcessor in our case), configure it and attach the handler functions.

Not much needs to be changed here, except adapting the template code to handle the Gromlins contract and setting the processor to use the moonbeam archive URL retrieved from the archive registry.

To test out the examples in this guide on Moonbeam or Moonriver, you will need to have your own endpoint and API key, which you can get from one of the supported Endpoint Providers.

If you are adapting this guide for Moonriver or Moonbase Alpha, be sure to update the data source to the correct network:

Moonbeam:

  chain: process.env.RPC_ENDPOINT, // TODO: Add the endpoint to your .env file
  archive: lookupArchive('moonbeam', { type: 'Substrate' }),

Moonriver:

  chain: process.env.RPC_ENDPOINT, // TODO: Add the endpoint to your .env file
  archive: lookupArchive('moonriver', { type: 'Substrate' }),

Moonbase Alpha:

  chain: process.env.RPC_ENDPOINT, // TODO: Add the endpoint to your .env file
  archive: lookupArchive('moonbase', { type: 'Substrate' }),


The lookupArchive function is used to consult the archive registry and yield the archive address, given a network name. Network names should be in lowercase.

You'll also need to modify the Context type so that it is exported and can be used in the src/contract.ts file.

export type Context = BatchContext<Store, Item>;

Here is the end result:

// src/processor.ts
import { lookupArchive } from '@subsquid/archive-registry';
import { Store, TypeormDatabase } from '@subsquid/typeorm-store';
import {
  BatchContext,
  BatchProcessorItem,
  EvmLogEvent,
  SubstrateBatchProcessor,
  SubstrateBlock,
} from '@subsquid/substrate-processor';
import { In } from 'typeorm';
import { ethers } from 'ethers';
import { contractAddress, getContractEntity } from './contract';
import { Owner, Token, Transfer } from './model';
import * as erc721 from './abi/erc721';
import { EvmLog, getEvmLog } from '@subsquid/frontier';

const database = new TypeormDatabase();

const processor = new SubstrateBatchProcessor()
  .setDataSource({
    // FIXME: set RPC_ENDPOINT secret when deploying to Aquarium
    //        (see the Subsquid docs on environment variables)
    chain: process.env.RPC_ENDPOINT || 'wss://wss.api.moonbeam.network',
    archive: lookupArchive('moonbeam', { type: 'Substrate' }),
  })
  .addEvmLog(contractAddress, {
    filter: [[erc721.events.Transfer.topic]],
  });

type Item = BatchProcessorItem<typeof processor>;
export type Context = BatchContext<Store, Item>;

processor.run(database, async (ctx) => {
  const transfersData: TransferData[] = [];

  for (const block of ctx.blocks) {
    for (const item of block.items) {
      if (item.name === 'EVM.Log') {
        // EVM log extracted from the substrate event
        const evmLog = getEvmLog(ctx, item.event);
        const transfer = handleTransfer(block.header, item.event, evmLog);
        transfersData.push(transfer);
      }
    }
  }

  await saveTransfers(ctx, transfersData);
});

type TransferData = {
  id: string;
  from: string;
  to: string;
  token: ethers.BigNumber;
  timestamp: bigint;
  block: number;
  transactionHash: string;
};

function handleTransfer(
  block: SubstrateBlock,
  event: EvmLogEvent,
  evmLog: EvmLog
): TransferData {
  const { from, to, tokenId } = erc721.events.Transfer.decode(evmLog);

  const transfer: TransferData = {
    id: event.id,
    from,
    to,
    token: tokenId,
    timestamp: BigInt(block.timestamp),
    block: block.height,
    transactionHash: event.evmTxHash,
  };

  return transfer;
}

async function saveTransfers(ctx: Context, transfersData: TransferData[]) {
  const tokensIds: Set<string> = new Set();
  const ownersIds: Set<string> = new Set();

  for (const transferData of transfersData) {
    tokensIds.add(transferData.token.toString());
    ownersIds.add(transferData.from);
    ownersIds.add(transferData.to);
  }

  const transfers: Set<Transfer> = new Set();

  const tokens: Map<string, Token> = new Map(
    (await ctx.store.findBy(Token, { id: In([...tokensIds]) })).map((token) => [
      token.id,
      token,
    ])
  );

  const owners: Map<string, Owner> = new Map(
    (await ctx.store.findBy(Owner, { id: In([...ownersIds]) })).map((owner) => [
      owner.id,
      owner,
    ])
  );

  if (process.env.RPC_ENDPOINT == undefined) {
    ctx.log.warn(`RPC_ENDPOINT env variable is not set`);
  }

  for (const transferData of transfersData) {
    const contract = new erc721.Contract(
      ctx,
      { height: transferData.block },
      contractAddress
    );

    let from = owners.get(transferData.from);
    if (from == null) {
      from = new Owner({ id: transferData.from, balance: 0n });
      owners.set(from.id, from);
    }

    let to = owners.get(transferData.to);
    if (to == null) {
      to = new Owner({ id: transferData.to, balance: 0n });
      owners.set(to.id, to);
    }

    const tokenId = transferData.token.toString();

    let token = tokens.get(tokenId);
    if (token == null) {
      token = new Token({
        id: tokenId,
        // FIXME: use multicall here to batch
        //        contract calls and speed up indexing
        uri: await contract.tokenURI(transferData.token),
        contract: await getContractEntity(ctx),
      });
      tokens.set(token.id, token);
      ctx.log.info(`Upserted NFT: ${token.id}`);
    }
    token.owner = to;

    const { id, block, transactionHash, timestamp } = transferData;

    const transfer = new Transfer({
      id,
      token,
      from,
      to,
      timestamp,
      block,
      transactionHash,
    });

    transfers.add(transfer);
  }

  await ctx.store.save([...owners.values()]);
  await ctx.store.save([...tokens.values()]);
  await ctx.store.save([...transfers]);
}



It is also worth pointing out that the contract.tokenURI call is accessing the state of the contract via a chain RPC endpoint. This is slowing down the indexing a little bit, but this data is only available this way. You'll find more information on accessing state in the dedicated section of the Subsquid docs.


This code expects to find the URL of a working Moonbeam RPC endpoint in the RPC_ENDPOINT environment variable. Set it in the .env file and in Aquarium secrets if and when you deploy your squid there. We tested the code using the public endpoint available at wss://wss.api.moonbeam.network; for production, we recommend using private endpoints.
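A minimal .env file could look like the sketch below (the endpoint URL is a placeholder; substitute the one you obtained from your provider):

```sh
# .env — read by the processor at startup
RPC_ENDPOINT=wss://your-moonbeam-endpoint.example
```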

Launch and Set Up the Database

When running the project locally it is possible to use the docker-compose.yml file that comes with the template to launch a PostgreSQL container. To do so, run sqd up in your terminal.

Squid projects automatically manage the database connection and schema via an ORM abstraction. In this approach the schema is managed through migration files. Because we made changes to the schema, we need to remove the existing migration(s) and create a new one, then apply the new migration.

This involves the following steps:

  1. Build the code:

    sqd build
  2. Make sure you start with a clean Postgres database. The following commands drop-create a new Postgres instance in Docker:

    sqd down
    sqd up
  3. Generate the new migration (this will wipe any old migrations):

    sqd migration:generate
  4. Apply the migration, so that the tables are created in the database:

    sqd migration:apply

Launch the Project

To launch the processor run the following command (this will block the current terminal):

sqd process

Finally, in a separate terminal window, launch the GraphQL server:

sqd serve

Visit localhost:4350/graphql to access the GraphiQL console. From this window, you can perform queries such as this one, to find out the account owners with the biggest balances:

query MyQuery {
  owners(limit: 10, orderBy: balance_DESC) {
    id
    balance
  }
}
Or this other one, looking up the tokens owned by a given owner:

query MyQuery {
  tokens(where: {owner: {id_eq: "0x5274a86d39fd6db8e73d0ab6d7d5419c1bf593f8"}}) {
    id
    uri
    contract {
      id
      name
    }
  }
}
Have fun playing around with queries, after all, it's a playground!

Publish the Project

Subsquid offers a SaaS solution to host projects created by its community. All templates ship with a deployment manifest file named squid.yml, which is used in conjunction with the Squid CLI command sqd deploy.
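For reference, a minimal squid.yml might look roughly like the sketch below. The manifest schema evolves over time, so treat these field names as an assumption and check the current Subsquid deployment docs before using it:

```yaml
# squid.yml — deployment manifest (illustrative sketch; verify against the Subsquid docs)
manifestVersion: subsquid.io/v0.1
name: moonbeam-tutorial
version: 1
description: NFT transfer indexer for Moonbeam
build:
deploy:
  addons:
    postgres:
  processor:
    cmd: ['node', 'lib/processor']
  api:
    cmd: ['npx', 'squid-graphql-server']
```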

Please refer to the Deploy your Squid section of Subsquid's documentation site for more information.

Example Project Repository

You can view the template used here, as well as many other example repositories on Subsquid's examples organization on GitHub.

Subsquid's documentation contains informative material, and it's the best place to start if you are curious about aspects that were not fully explained in this guide.

This tutorial is for educational purposes only. As such, any contracts or code created in this tutorial should not be used in production.
The information presented herein has been provided by third parties and is made available solely for general information purposes. Moonbeam does not endorse any project listed and described on the Moonbeam Doc Website ( Moonbeam Foundation does not warrant the accuracy, completeness or usefulness of this information. Any reliance you place on such information is strictly at your own risk. Moonbeam Foundation disclaims all liability and responsibility arising from any reliance placed on this information by you or by anyone who may be informed of any of its contents. All statements and/or opinions expressed in these materials are solely the responsibility of the person or entity providing those materials and do not necessarily represent the opinion of Moonbeam Foundation. The information should not be construed as professional or financial advice of any kind. Advice from a suitably qualified professional should always be sought in relation to any particular matter or circumstance. The information herein may link to or integrate with other websites operated or content provided by third parties, and such other websites may link to this website. Moonbeam Foundation has no control over any such other websites or their content and will have no liability arising out of or related to such websites or their content. The existence of any such link does not constitute an endorsement of such websites, the content of the websites, or the operators of the websites. These links are being provided to you only as a convenience and you release and hold Moonbeam Foundation harmless from any and all liability arising from your use of this information or the information provided by any third-party website or service.
Last update: May 14, 2024
| Created: April 5, 2022