getProgramAccounts is the most commonly used NFT query in web3, but usually we only need part of the data.
The official Metaplex code handles this poorly: it crawls every account of the program across the whole network, which is not suitable for commercial use. So what do we do when we need to query with more flexible conditions? This is where the web3 filter memcmp comes in; it lets us match account data by byte position.
My last article, "Solana DApp: ways to get various account data", covered some basic ways to fetch account data.
However, we need these more complex queries when the data bound to an account is not clean enough on its own (mainly because accountInfo returns limited data).
About Conditional Query
MyBidderPot example
First, decode the account data with BidderPotParser. Its SCHEMA is:
[
  BidderPot,
  {
    kind: 'struct',
    fields: [
      ['bidderPot', 'pubkeyAsString'],
      ['bidderAct', 'pubkeyAsString'],
      ['auctionAct', 'pubkeyAsString'],
      ['emptied', 'u8'],
    ],
  },
]
The filter example
filters: [
  {
    dataSize: BIDDER_POT_LEN,
  },
  {
    memcmp: {
      offset: 0,
      bytes: '7725SAfE76PUNuXReZZvn8xSBEn1K65bzgnCAvNuEUP3',
    },
  },
],
dataSize filters by the total space the account data occupies, in bytes.
memcmp matches raw bytes in the account data: offset is where the comparison starts, and there is no end parameter; the end is simply offset plus the length of bytes. Do not re-encode a slice of the already encoded account data for the comparison, because encoding a fragment does not produce the same string as the corresponding part of the fully encoded data (the two are completely incompatible); pass the value itself, for example a base58 public key string.
Offsets start at 0.
The offset of every other field is calculated from the SCHEMA. For example, to query by auctionAct, the offset is 32 + 32 = 64 (no -1 adjustment is needed).
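To make the offset arithmetic concrete, here is a minimal sketch (the FIELD_SIZE map and computeOffset helper are my own illustration, not part of the Metaplex code) that derives a field's offset from the SCHEMA above:

// Illustrative field sizes, in bytes, for the types used in the SCHEMA.
const FIELD_SIZE: Record<string, number> = {
  pubkeyAsString: 32,
  u64: 8,
  u8: 1,
};

// The offset of a field is the sum of the sizes of all fields before it.
const computeOffset = (fields: [string, string][], target: string): number =>
  fields
    .slice(0, fields.findIndex(([name]) => name === target))
    .reduce((sum, [, type]) => sum + FIELD_SIZE[type], 0);

const bidderPotFields: [string, string][] = [
  ['bidderPot', 'pubkeyAsString'],
  ['bidderAct', 'pubkeyAsString'],
  ['auctionAct', 'pubkeyAsString'],
  ['emptied', 'u8'],
];

computeOffset(bidderPotFields, 'bidderPot');  // 0
computeOffset(bidderPotFields, 'bidderAct');  // 32
computeOffset(bidderPotFields, 'auctionAct'); // 64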
A complete example with multiple conditions
import { Connection } from '@solana/web3.js';
import { getProgramAccounts } from 'contexts/meta/loadAccounts';
import { AUCTION_ID, BIDDER_POT_LEN } from 'npms/oystoer';

interface myBidderPotProps {
  connection: Connection;
  walletPubkey: string;
  auctionPubkey: string;
}

export const getMyBidderPot = async ({
  connection,
  walletPubkey,
  auctionPubkey,
}: myBidderPotProps) => {
  const biderPot = await getProgramAccounts(connection, AUCTION_ID, {
    filters: [
      {
        dataSize: BIDDER_POT_LEN,
      },
      {
        memcmp: {
          offset: 32, // bidderAct
          bytes: walletPubkey,
        },
      },
      {
        memcmp: {
          offset: 32 + 32, // auctionAct
          bytes: auctionPubkey,
        },
      },
    ],
  });
  return biderPot;
};
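The getProgramAccounts used above is the wrapper from the oyster/Metaplex context code. If you are not using that wrapper, a roughly equivalent query can be sent with the plain @solana/web3.js method on Connection. A sketch under that assumption (treating AUCTION_ID as the auction program id):

import { Connection, PublicKey } from '@solana/web3.js';
// AUCTION_ID and BIDDER_POT_LEN come from the same 'npms/oystoer' import as above.

// Same three conditions, sent straight to the RPC node.
const getMyBidderPotRaw = async (
  connection: Connection,
  walletPubkey: string,
  auctionPubkey: string,
) =>
  connection.getProgramAccounts(new PublicKey(AUCTION_ID), {
    filters: [
      { dataSize: BIDDER_POT_LEN },
      { memcmp: { offset: 32, bytes: walletPubkey } },       // bidderAct
      { memcmp: { offset: 32 + 32, bytes: auctionPubkey } }, // auctionAct
    ],
  });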
Query Edition by account
With Solana web3's accountInfo you can query the masterEdition address (using the JSON form; base64 alone is not usable for this), but the editions themselves cannot be queried that way.
The existing open-source Metaplex code does it like this:
// Use getProgramAccounts to pull every account owned by METADATA_PROGRAM_ID
getProgramAccounts(connection, METADATA_PROGRAM_ID).then(
  forEach(processMetaData),
)

// After the traversal, the front end filters on the raw buffer data
const isEditionV1Account = (account: AccountInfo<Buffer>) =>
  account.data[0] === MetadataKey.EditionV1;
// This is the Edition data format
export class Edition {
  key: MetadataKey;
  /// Points at MasterEdition struct
  parent: StringPublicKey;
  /// Starting at 0 for master record, this is incremented for each edition minted.
  edition: BN;

  constructor(args: { key: MetadataKey; parent: StringPublicKey; edition: BN }) {
    this.key = MetadataKey.EditionV1;
    this.parent = args.parent;
    this.edition = args.edition;
  }
}

// And this is its SCHEMA
[
  Edition,
  {
    kind: 'struct',
    fields: [
      ['key', 'u8'],
      ['parent', 'pubkeyAsString'],
      ['edition', 'u64'],
    ],
  },
]
The offset for parent is the size of the key field, a u8, which equals 1.
So we can write the filter like this:
filters: [
  {
    memcmp: {
      offset: 1,
      bytes: parentPubkey,
    },
  },
]
This gives us a list of candidate edition accounts, but they are not guaranteed to actually be Edition accounts, because the check on the leading key byte is missing. That check is not quite the same as the filter above, so how do we complete it?
There are two ways to filter. The first is front-end filtering, which is the simplest; since few accounts remain after the parent filter, it is not a big problem. The second is to have web3, that is the node, handle it directly. Let's look at that case.
The Buffer is the account data after base64 or base58 decoding, so it can be matched against directly: account.data[0] is the content of the first field in the schema. For the Edition schema that first field is the key value, so we can add a condition at offset: 0. The filters code is as follows:
filters: [
  {
    memcmp: {
      offset: 0,
      bytes: bs58.encode(
        Buffer.from(MetadataKey.EditionV1.toString(), 'utf8'),
      ),
    },
  },
  {
    memcmp: {
      offset: 1,
      bytes: parentPubkey,
    },
  },
]

export enum MetadataKey {
  Uninitialized = 0,
  MetadataV1 = 4,
  EditionV1 = 1,
  MasterEditionV1 = 2,
  MasterEditionV2 = 6,
  EditionMarker = 7,
}
Complete function code
export const accountGetEditionInfo = async ({
  parentPubkey,
  connection,
}: {
  parentPubkey: string;
  connection: Connection;
}) => {
  const res = await getProgramAccounts(connection, METADATA_PROGRAM_ID, {
    filters: [
      {
        memcmp: {
          offset: 1, // parent
          bytes: parentPubkey,
        },
      },
    ],
  });
  return res?.filter(e => e.account.data[0] === MetadataKey.EditionV1) ?? [];
};
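Calling it looks like this (masterEditionPubkey here is just a placeholder for the parent MasterEdition address):

const editions = await accountGetEditionInfo({
  connection,
  parentPubkey: masterEditionPubkey, // placeholder
});
console.log('editions found:', editions.length);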
In the end, the memcmp on the key at offset 0 could not be made to work, so the second, downgraded scheme was adopted. The specific cause of the problem is unknown; please let me know if you know it. So this place still uses front-end filtering; after all, only a small number of other accounts will happen to match the parent filter, so the amount of data is not large.
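My guess at the cause, unverified: MetadataKey.EditionV1.toString() is the string '1', so Buffer.from('1', 'utf8') holds the ASCII byte 0x31, while the on-chain key field holds the raw value 0x01, and the memcmp can never match. Encoding the raw byte instead might make the node-side filter work:

import bs58 from 'bs58';

// Encode the raw key byte (0x01), not the character '1' (0x31).
const editionKeyFilter = {
  memcmp: {
    offset: 0,
    bytes: bs58.encode(Buffer.from([MetadataKey.EditionV1])),
  },
};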
Memory size
Space occupied by the basic types:

Type | Memory size (bytes)
---|---
pubkeyAsString | 32
u64 | 8
u8 | 1
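As a worked example with this table, the BidderPot account size used earlier presumably works out as follows (my own arithmetic, not the actual oyster constant):

// BidderPot: bidderPot + bidderAct + auctionAct (3 pubkeys) + emptied (u8)
const BIDDER_POT_LEN = 32 + 32 + 32 + 1; // 97 bytes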
Thanks
- Thanks to Solana group friend Fun for her kind help.