Mapping
Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the schema.graphql file.
- Mappings are defined in the src/mappings directory and are exported as a function.
- These mappings are also exported in src/index.ts.
- Mapping files are referenced in project.yaml under the mapping handlers.
There are three classes of mapping functions: Block Handlers, Event Handlers, and Call Handlers.
Block Handler
You can use block handlers to capture information each time a new block is attached to the Substrate chain, e.g. the block number. To achieve this, a defined BlockHandler will be called once for every block.
import { SubstrateBlock } from "@subql/types";
import { StarterEntity } from "../types";

export async function handleBlock(block: SubstrateBlock): Promise<void> {
  // Create a new StarterEntity with the block hash as its ID
  const record = new StarterEntity(block.block.header.hash.toString());
  record.field1 = block.block.header.number.toNumber();
  await record.save();
}
A SubstrateBlock is an extended interface type of SignedBlock, but it also includes the specVersion and timestamp.
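For example, here is a minimal sketch (the handler name is illustrative; logger is the global logger provided by the SubQuery sandbox) that reads these extra fields:
export async function handleBlockMetadata(block: SubstrateBlock): Promise<void> {
  // specVersion and timestamp are the extra fields added on top of SignedBlock
  const version = block.specVersion;
  const producedAt = block.timestamp;
  logger.info(
    `Block ${block.block.header.number.toNumber()} (spec ${version}) was produced at ${producedAt}`
  );
}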
Event Handler
You can use event handlers to capture information when certain events are included in a new block. The events are part of the default Substrate runtime, and a block may contain multiple events.
During processing, the event handler will receive a substrate event as an argument with the event's typed inputs and outputs. Any type of event will trigger the mapping, allowing activity with the data source to be captured. You should use Mapping Filters in your manifest to filter events to reduce the time it takes to index data and improve mapping performance.
import { SubstrateEvent } from "@subql/types";
import { Balance } from "@polkadot/types/interfaces";
import { StarterEntity } from "../types";

export async function handleEvent(event: SubstrateEvent): Promise<void> {
  const {event: {data: [account, balance]}} = event;
  // Create a new StarterEntity with the hash of the block containing this event as its ID
  const record = new StarterEntity(event.extrinsic.block.block.header.hash.toString());
  record.field2 = account.toString();
  record.field3 = (balance as Balance).toBigInt();
  await record.save();
}
A SubstrateEvent is an extended interface type of EventRecord. Besides the event data, it also includes an id (the block to which this event belongs) and the extrinsic inside this block.
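A minimal sketch (handler name illustrative) of reading these extra properties as exposed by @subql/types:
import { SubstrateEvent } from "@subql/types";

export async function handleEventMetadata(event: SubstrateEvent): Promise<void> {
  // The block this event belongs to, and the event's index within that block
  const blockNumber = event.block.block.header.number.toNumber();
  // The extrinsic that emitted this event (may be undefined for system events)
  const extrinsicHash = event.extrinsic?.extrinsic.hash.toString();
  logger.info(`Event ${event.idx} in block ${blockNumber}, extrinsic ${extrinsicHash}`);
}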
Call Handler
Call handlers are used when you want to capture information on certain substrate extrinsics.
import { SubstrateExtrinsic } from "@subql/types";
import { StarterEntity } from "../types";

export async function handleCall(extrinsic: SubstrateExtrinsic): Promise<void> {
  const record = new StarterEntity(
    extrinsic.block.block.header.hash.toString()
  );
  record.field4 = extrinsic.block.timestamp;
  await record.save();
}
The SubstrateExtrinsic extends GenericExtrinsic. It is assigned an id (the block to which this extrinsic belongs) and provides an extrinsic property that wraps the underlying extrinsic, along with the events emitted in this block. Additionally, it records the success status of this extrinsic.
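A short sketch (handler name illustrative) of reading these properties in a call handler:
import { SubstrateExtrinsic } from "@subql/types";

export async function handleCallMetadata(extrinsic: SubstrateExtrinsic): Promise<void> {
  // Events emitted in the block alongside this extrinsic, and its success status
  const eventCount = extrinsic.events.length;
  logger.info(
    `Extrinsic ${extrinsic.idx} in block ${extrinsic.block.block.header.number.toNumber()} ` +
      `emitted ${eventCount} events, success: ${extrinsic.success}`
  );
}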
Query States
Our goal is to cover all data sources for users in mapping handlers (more than just the three interface event types above). Therefore, we have exposed some of the @polkadot/api interfaces to increase these capabilities.
These are the interfaces we currently support (a usage sketch follows the lists below):
- api.query.<module>.<method>() will query the current block.
- api.query.<module>.<method>.multi() will make multiple queries of the same type at the current block.
- api.queryMulti() will make multiple queries of different types at the current block.
These are the interfaces we do NOT support currently:
- api.tx.*
- api.derive.*
- api.query.<module>.<method>.at
- api.query.<module>.<method>.entriesAt
- api.query.<module>.<method>.entriesPaged
- api.query.<module>.<method>.hash
- api.query.<module>.<method>.keysAt
- api.query.<module>.<method>.keysPaged
- api.query.<module>.<method>.range
- api.query.<module>.<method>.sizeAt
See an example of using this API in our validator-threshold example use case.
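For instance, here is a hedged sketch of the three supported query styles inside a handler; the timestamp, system, and balances storage items and the well-known dev addresses are illustrative assumptions, so substitute the storage items your chain actually exposes:
export async function handleQueryExample(): Promise<void> {
  // Single query at the current indexing block
  const now = await api.query.timestamp.now();
  // Multiple queries of the same type at the current block
  const [alice, bob] = await api.query.system.account.multi([
    "5GrwvaEF5zXb26Fz9rcQpDWS57CtERHpNehXCPcNoHGKutQY",
    "5FHneW46xGXgs5mUiveU4sbTyGBzmstUspZC92UhjJM694ty",
  ]);
  // Multiple queries of different types at the current block
  const [issuance, timestamp] = await api.queryMulti([
    api.query.balances.totalIssuance,
    api.query.timestamp.now,
  ]);
  logger.info(
    `now=${now.toString()} (${timestamp.toString()}), alice=${alice.toString()}, ` +
      `bob=${bob.toString()}, issuance=${issuance.toString()}`
  );
}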
RPC calls
We also support some API RPC methods, which are remote calls that allow the mapping function to interact with the actual node for queries and submissions. A core premise of SubQuery is that it is deterministic, and therefore, to keep the results consistent, we only allow historical RPC calls.
Documents in JSON-RPC provide some methods that take BlockHash as an input parameter (e.g. at?: BlockHash), which are now permitted. We have also modified these methods to take the current indexing block hash by default.
// Let's say we are currently indexing a block with this hash
const blockhash = `0x844047c4cf1719ba6d54891e92c071a41e3dfe789d064871148e9d41ef086f6a`;
// The original method takes an optional block hash as input
const b1 = await api.rpc.chain.getBlock(blockhash);
// It will use the current indexing block hash by default, like so
const b2 = await api.rpc.chain.getBlock();
- For Custom Substrate Chains RPC calls, see the usage section below.
Modules and Libraries
To improve SubQuery's data processing capabilities, we have allowed some of NodeJS's built-in modules for running mapping functions in the sandbox, and have allowed users to call third-party libraries.
Please note this is an experimental feature and you may encounter bugs or issues that may negatively impact your mapping functions. Please report any bugs you find by creating an issue on GitHub.
Built-in modules
Currently, we allow the following NodeJS modules: assert, buffer, crypto, util, and path.
Rather than importing the whole module, we recommend only importing the required method(s) that you need. Some methods in these modules may have dependencies that are unsupported and will fail on import.
import { hashMessage } from "ethers/lib/utils"; // Good way
import { utils } from "ethers"; // Bad way
Third-party libraries
Due to the limitations of the virtual machine in our sandbox, we currently only support third-party libraries written in CommonJS.
We also support hybrid libraries like @polkadot/* that use ESM as the default. However, if any other libraries depend on any modules in ESM format, the virtual machine will NOT compile them and will return an error.
Custom Substrate Chains
SubQuery can be used on any Substrate-based chain, not just Polkadot or Kusama.
You can use a custom Substrate-based chain, and we provide tools to automatically import types, interfaces, and additional methods using @polkadot/typegen.
In the following sections, we use our kitty example to explain the integration process.
Preparation
Create a new directory api-interfaces under the project src folder to store all required and generated files. We also create an api-interfaces/kitties directory as we want to add decoration in the API from the kitties module.
Metadata
We need metadata to generate the actual API endpoints. In the kitty example, we use an endpoint from a local testnet, and it provides additional types. Follow the steps in PolkadotJS metadata setup to retrieve a node's metadata from its HTTP endpoint.
curl -H "Content-Type: application/json" -d '{"id":"1", "jsonrpc":"2.0", "method": "state_getMetadata", "params":[]}' http://localhost:9933
or from its websocket endpoint with help from websocat:
# Install websocat
brew install websocat
# Get metadata
echo state_getMetadata | websocat 'ws://127.0.0.1:9944' --jsonrpc
Next, copy and paste the output to a JSON file. In our kitty example, we have created src/api-interfaces/kitty.json.
Type definitions
We assume that the user knows the specific types and RPC support of the chain, and that these are defined in the Manifest.
Following types setup, we create:
src/api-interfaces/definitions.ts - this exports all the sub-folder definitions:
export { default as kitties } from "./kitties/definitions";
src/api-interfaces/kitties/definitions.ts - type definitions for the kitties module:
export default {
// custom types
types: {
Address: "AccountId",
LookupSource: "AccountId",
KittyIndex: "u32",
Kitty: "[u8; 16]",
},
// custom rpc : api.rpc.kitties.getKittyPrice
rpc: {
getKittyPrice: {
description: "Get Kitty price",
params: [
{
name: "at",
type: "BlockHash",
isHistoric: true,
isOptional: false,
},
{
name: "kittyIndex",
type: "KittyIndex",
isOptional: false,
},
],
type: "Balance",
},
},
};
Packages
- In the package.json file, make sure to add @polkadot/typegen as a development dependency and @polkadot/api as a regular dependency (ideally the same version). We also need ts-node as a development dependency to help us run the scripts.
- We add scripts to run both the types (generate:defs) and metadata (generate:meta) generators (in that order, so the metadata can use the types).
Here is a simplified version of package.json. Make sure in the scripts section the package name is correct and the directories are valid.
{
"name": "kitty-birthinfo",
"scripts": {
"generate:defs": "ts-node --skip-project node_modules/.bin/polkadot-types-from-defs --package kitty-birthinfo/api-interfaces --input ./src/api-interfaces",
"generate:meta": "ts-node --skip-project node_modules/.bin/polkadot-types-from-chain --package kitty-birthinfo/api-interfaces --endpoint ./src/api-interfaces/kitty.json --output ./src/api-interfaces --strict"
},
"dependencies": {
"@polkadot/api": "^4.9.2"
},
"devDependencies": {
"typescript": "^4.1.3",
"@polkadot/typegen": "^4.9.2",
"ts-node": "^8.6.2"
}
}
Type generation
Now that preparation is completed, we are ready to generate types and metadata. Run the commands below:
# Yarn to install new dependencies
yarn
# Generate types
yarn generate:defs
In each module's folder (e.g. /kitties), there should now be a generated types.ts that defines all interfaces from this module's definitions, as well as an index.ts file that exports them all.
# Generate metadata
yarn generate:meta
This command will generate the metadata and a new api-augment for the APIs. As we don't want to use the built-in API, we will need to replace them by adding an explicit override in our tsconfig.json. After the updates, the paths in the config will look like this (without the comments):
{
  "compilerOptions": {
    "paths": {
      // this is the package name we use (in the interface imports, --package for generators)
      "kitty-birthinfo/*": ["src/*"],
      // here we replace the @polkadot/api augmentation with our own, generated from chain
      "@polkadot/api/augment": ["src/api-interfaces/augment-api.ts"],
      // replace the augmented types with our own, as generated from definitions
      "@polkadot/types/augment": ["src/api-interfaces/augment-types.ts"]
    }
  }
}
Usage
Now in the mapping function, we can show how the metadata and types actually decorate the API. The RPC endpoint will support the modules and methods we declared above. To use a custom RPC call, please see the section Custom chain RPC calls below.
export async function kittyApiHandler(): Promise<void> {
  // returns the KittyIndex type
  const nextKittyId = await api.query.kitties.nextKittyId();
  // returns the Kitty type, the input parameter types are AccountId and KittyIndex
  const allKitties = await api.query.kitties.kitties("xxxxxxxxx", 123);
  logger.info(`Next kitty id ${nextKittyId}`);
  // custom RPC, passing undefined for the block hash uses the current indexing block
  const kittyPrice = await api.rpc.kitties.getKittyPrice(
    undefined,
    nextKittyId
  );
}
If you wish to publish this project to our explorer, please include the generated files in src/api-interfaces.
Custom chain RPC calls
To support customised chain RPC calls, we must manually inject RPC definitions for typesBundle, allowing per-spec configuration. You can define the typesBundle in the project.yml. Please remember that only isHistoric types of calls are supported.
...
  types: {
    "KittyIndex": "u32",
    "Kitty": "[u8; 16]",
  }
  typesBundle: {
    spec: {
      chainname: {
        rpc: {
          kitties: {
            getKittyPrice: {
              description: "Get Kitty price",
              params: [
                {
                  name: 'at',
                  type: 'BlockHash',
                  isHistoric: true,
                  isOptional: false
                },
                {
                  name: 'kittyIndex',
                  type: 'KittyIndex',
                  isOptional: false
                }
              ],
              type: "Balance",
            }
          }
        }
      }
    }
  }
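To illustrate, here is a hedged sketch of calling the custom RPC declared above from a mapping handler; the handler name and kitty index are assumptions. Because getKittyPrice is declared isHistoric, it can be pinned to a specific block hash, while passing undefined falls back to the current indexing block:
import { SubstrateEvent } from "@subql/types";

export async function handleKittyPriceQuery(event: SubstrateEvent): Promise<void> {
  // Pin the historic call to the block that contains this event
  const at = event.block.block.header.hash;
  // Kitty index 0 is only an example value
  const price = await api.rpc.kitties.getKittyPrice(at, 0);
  logger.info(`Kitty 0 price at block ${at.toString()}: ${price.toString()}`);
}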