Starknet Mapping
Mapping functions define how chain data is transformed into the optimised GraphQL entities that we have previously defined in the schema.graphql file.
- Mappings are defined in the src/mappings directory and are exported as a function.
- These mappings are also exported in src/index.ts (see the sketch below).
- The mapping files are referenced in project.ts under the mapping handlers.
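For reference, src/index.ts typically just re-exports the mapping handlers so the indexer node can load them. A minimal sketch (the mappingHandlers file name is illustrative and depends on your project layout):
// src/index.ts
// Re-export every mapping handler defined under src/mappings
// (the file name below is illustrative)
export * from "./mappings/mappingHandlers";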
There are different classes of mapping functions for Starknet: Block Handlers, Transaction Handlers, and Log Handlers.
Block Handler
You can use block handlers to capture information each time a new block is attached to the chain, e.g. block number. To achieve this, a defined BlockHandler will be called once for every block.
Using block handlers slows your project down as they are executed for each and every block - only use them if you need to.
import { StarknetBlock } from "@subql/types-starknet";
// BlockEntity is the entity generated from schema.graphql
import { BlockEntity } from "../types";

export async function handleBlock(block: StarknetBlock): Promise<void> {
  // Create a new BlockEntity with the block hash as its ID
  const record = new BlockEntity(block.blockHash);
  record.height = BigInt(block.blockNumber);
  await record.save();
}
Transaction Handler
You can use transaction handlers to capture information about each of the transactions in a block. To achieve this, a defined TransactionHandler will be called once for every transaction. You should use Mapping Filters in your manifest to filter transactions to reduce the time it takes to index data and improve mapping performance.
We store the decoded calls in the transaction object, so you can easily access them and their args. To distinguish between different call types, you can use the selector field in the decoded call object.
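Selectors are the starknet_keccak hash of the entry-point name, so you can compute them with starknet.js rather than copying them by hand. A minimal sketch (assuming the hard-coded selectors in the example below correspond to a withdraw entry point):
import { hash } from "starknet";

// Compute the selector for an entry point named "withdraw" (starknet_keccak of the name)
const withdrawSelector = hash.getSelectorFromName("withdraw");
logger.info(`withdraw selector: ${withdrawSelector}`);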
import { StarknetTransaction } from "@subql/types-starknet";
import assert from "assert";
// Withdraw is the entity generated from schema.graphql; convertBigNumberish is a
// small user-defined helper (import path is illustrative, see the sketch after this example)
import { Withdraw } from "../types";
import { convertBigNumberish } from "./utils";

// WithdrawTransaction is assumed here to be an alias of StarknetTransaction for this handler
type WithdrawTransaction = StarknetTransaction;

export async function handleTransaction(
  tx: WithdrawTransaction
): Promise<void> {
  logger.info(`New Withdraw transaction at block ${tx.blockNumber}`);
  assert(tx.decodedCalls, "No tx decodedCalls");

  for (let i = 0; i < tx.decodedCalls.length; i++) {
    const call = tx.decodedCalls[i];
    // Note: all invoke calls are returned in decodedCalls, so we need to filter
    // out the calls we are interested in by their selector
    if (
      call.selector ===
        "0x015511cc3694f64379908437d6d64458dc76d02482052bfb8a5b33a72c054c77" ||
      call.selector ===
        "0x15511cc3694f64379908437d6d64458dc76d02482052bfb8a5b33a72c054c77"
    ) {
      const withdraw = Withdraw.create({
        id: `${tx.hash}_${i}`,
        user: tx.from,
        // Convert the BigNumberish to hex
        token: convertBigNumberish(call.decodedArgs.token),
        amount: BigInt(call.decodedArgs.amount),
      });
      logger.info(`withdraw at ${withdraw.id}`);
      await withdraw.save();
    }
  }
}
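The convertBigNumberish helper used above is not part of @subql/types-starknet; it is a small user-defined utility. A minimal sketch, assuming you just want a 0x-prefixed hex string and relying on starknet.js's num utilities:
import { num, BigNumberish } from "starknet";

// Hypothetical helper: normalise a BigNumberish (bigint, number, decimal or hex string) to a hex string
export function convertBigNumberish(value: BigNumberish): string {
  return num.toHex(value);
}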
Log Handler
You can use log handlers to capture information when certain logs are included in transactions. During processing, the log handler will receive a log as an argument with the log's typed inputs and outputs. Any type of event will trigger the mapping, allowing activity with the data source to be captured. You should use Mapping Filters in your manifest to filter events to reduce the time it takes to index data and improve mapping performance.
import { StarknetLog } from "@subql/types-starknet";
import { BigNumberish } from "starknet";
import assert from "assert";
// Deposit is the entity generated from schema.graphql; convertBigNumberish is the
// same user-defined helper used above (import path is illustrative)
import { Deposit } from "../types";
import { convertBigNumberish } from "./utils";

type DepositEvent = {
  user: BigNumberish;
  token: BigNumberish;
  face_amount: string;
};

type DepositArgs = {
  "zklend::market::Market::Deposit": DepositEvent;
  block_hash: string;
  block_number: number;
  transaction_hash: string;
};

type DepositLog = StarknetLog<DepositArgs>;

export async function handleLog(log: DepositLog): Promise<void> {
  logger.info(`New deposit event at block ${log.blockNumber}`);
  assert(log.args, "No log.args");

  const event = log.args["zklend::market::Market::Deposit"];
  const user = convertBigNumberish(event.user);
  const token = convertBigNumberish(event.token);

  const deposit = Deposit.create({
    id: `${log.transactionHash}_${user}`,
    token: token,
    amount: BigInt(event.face_amount),
    user: user,
  });

  logger.info(`deposit ${deposit.id}`);
  logger.info(`token ${token}, amount ${deposit.amount}`);
  await deposit.save();
}
Querying Contracts
We globally provide an api object that implements a starknet.js Provider. This allows querying contract state at the current block height being indexed. For example, to query the token balance of a user at the current indexed block height:
import { Contract, json } from "starknet";
import fs from "fs";

// Create an instance of the contract; you can get the contract address from the Transaction or Log
// Initialise the deployed contract
const testAddress =
  "0x7667469b8e93faa642573078b6bf8c790d3a6184b2a1bb39c5c923a732862e1";
const compiledTest = json.parse(
  fs.readFileSync("./compiledContracts/test.json").toString("ascii")
);

// Connect the contract using the globally injected api Provider
const myTestContract = new Contract(compiledTest.abi, testAddress, api);

// Query the balance
const bal1 = await myTestContract.get_balance();
logger.info(`Initial balance: ${bal1.toString()}`); // Cairo 1 contract
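If you don't have a compiled ABI to hand, the injected api Provider can also be called directly. A hedged sketch using starknet.js's callContract, reusing the testAddress and entry-point name from the example above (both remain assumptions):
// Low-level call through the injected Provider - no ABI required
const res = await api.callContract({
  contractAddress: testAddress,
  entrypoint: "get_balance",
});
logger.info(`Raw call result: ${JSON.stringify(res)}`);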
Third-party Library Support - the Sandbox
SubQuery is deterministic by design, which means that each SubQuery project is guaranteed to index the same data set. This is a critical factor that makes it possible to verify data in the decentralised SubQuery Network. Because of this, by default the indexer runs in a strict virtual machine with access to only a limited number of third-party libraries.
You can easily bypass this limitation, however, allowing you to retrieve data from external API endpoints, make non-historical RPC calls, and import your own external libraries into your projects. In order to do so, you must run your project in unsafe mode; you can read more about this in the references. An easy way to do this while developing (and running in Docker) is to add the following line to your docker-compose.yml:
subquery-node:
  image: onfinality/subql-node-starknet:latest
  ...
  command:
    - -f=/app
    - --db-schema=app
    - --unsafe
  ...
When run in unsafe mode, you can import any custom libraries into your project and make external API calls using tools like node-fetch. A simple example is given below:
import { StarknetTransaction } from "@subql/types-starknet";
import fetch from "node-fetch";

export async function handleTransaction(
  tx: StarknetTransaction
): Promise<void> {
  const httpData = await fetch("https://api.github.com/users/github");
  const body = await httpData.json();
  logger.info(`httpData: ${JSON.stringify(body)}`);
  // Do something with this data
}
By default (when in safe mode), the VM2 sandbox only allows the following:
- only certain built-in modules, e.g. assert, buffer, crypto, util and path
- third-party libraries written in CommonJS
- external HTTP and WebSocket connections are forbidden
Modules and Libraries
To improve SubQuery's data processing capabilities, we have allowed some of NodeJS's built-in modules for running mapping functions in the sandbox, and have allowed users to call third-party libraries.
Please note this is an experimental feature and you may encounter bugs or issues that may negatively impact your mapping functions. Please report any bugs you find by creating an issue in GitHub.
Built-in modules
Currently, we allow the following NodeJS modules: assert, buffer, crypto, util, and path.
Rather than importing the whole module, we recommend only importing the required method(s) that you need. Some methods in these modules may have dependencies that are unsupported and will fail on import.
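For example, rather than importing the whole crypto module, you might import only the method you use; a minimal sketch:
import { createHash } from "crypto";

// Hash some indexed data using only the single method we need from the crypto module
const digest = createHash("sha256").update("some data").digest("hex");
logger.info(`sha256 digest: ${digest}`);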