Squid Questions #6

Open
vikiival opened this issue Jan 23, 2023 · 0 comments
vikiival commented Jan 23, 2023

As soon as possible, I would love to have an MVP.
However, some things are slowing me down.

1. Architecture choice

a. Time Warp

The ideal solution, which I described with @Maar-io, is a marketplace contract that emits `Register(collection: Account32, metadata: BoundedVec<u8>)`.

When the SubSquid indexer picks up this event, it should spawn a new sub-indexer that fetches that (PSP34) contract.
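
For illustration, here is a minimal sketch of what the marketplace side of this could look like with the FireSquid-era `SubstrateBatchProcessor`. The Shibuya archive, the marketplace address, and the `./abi/marketplace` bindings (as generated by squid-ink-typegen) are all assumptions, since none of this is decided yet:

```ts
import { lookupArchive } from '@subsquid/archive-registry'
import { SubstrateBatchProcessor } from '@subsquid/substrate-processor'
import { TypeormDatabase } from '@subsquid/typeorm-store'
import { decodeEvent } from './abi/marketplace' // assumed: generated by squid-ink-typegen

// Hypothetical address -- the real marketplace contract is not deployed yet.
const MARKETPLACE = '0x0000000000000000000000000000000000000000000000000000000000000000'

const processor = new SubstrateBatchProcessor()
  .setDataSource({
    // Shibuya is an assumption; any archive serving the Contracts pallet works.
    archive: lookupArchive('shibuya', { release: 'FireSquid' }),
  })
  .addContractsContractEmitted(MARKETPLACE, {
    data: { event: { args: true } },
  })

processor.run(new TypeormDatabase(), async (ctx) => {
  for (const block of ctx.blocks) {
    for (const item of block.items) {
      // Only the marketplace contract is subscribed above, so the
      // item name check is enough to identify its events.
      if (item.name === 'Contracts.ContractEmitted') {
        const event = decodeEvent(item.event.args.data)
        if (event.__kind === 'Register') {
          // This is where the time warp would happen: persist
          // event.collection and spawn a sub-indexer for that
          // PSP34 contract. Subsquid has no built-in way to do this.
          ctx.log.info(`Register for collection ${event.collection}`)
        }
      }
    }
  }
})
```

The missing piece is the body of that inner `if`: a squid cannot start tracking a contract address it only learns about at runtime, which is exactly the Time Warp problem.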

However, as I mentioned in #3, the only indexer of this kind I have seen is Talisman's, and its architecture suffers a lot.

b. Factory

The second possible solution is to abandon any smart contracts outside the Marketplace Factory contract.

However, this has a few issues:

  • There is currently no filtering for events (something the EVM side already has).
  • The BatchProcessor has terrible DX (sorry to be harsh): it yields many irrelevant items that have to be filtered out manually.

Quick 🩹: skip using the batch processor (faster to implement, slower at indexing).

The event filter is still missing, though, as said in #4.
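
For reference, the quick fix could look roughly like this with the non-batch `SubstrateProcessor` (still shipped in `@subsquid/substrate-processor` at the time of writing). The address and network are placeholders, and the exact shape of `ctx.event.args` depends on the runtime version:

```ts
import { lookupArchive } from '@subsquid/archive-registry'
import { SubstrateProcessor } from '@subsquid/substrate-processor'
import { TypeormDatabase } from '@subsquid/typeorm-store'

// Hypothetical marketplace address, as above.
const MARKETPLACE = '0x0000000000000000000000000000000000000000000000000000000000000000'

const processor = new SubstrateProcessor(new TypeormDatabase())

processor.setDataSource({
  archive: lookupArchive('shibuya', { release: 'FireSquid' }), // assumed network
})

// One handler per event name: no batch item loop to untangle, but every
// Contracts.ContractEmitted on the chain is delivered, so the manual
// filtering below is what makes indexing slower.
processor.addEventHandler('Contracts.ContractEmitted', async (ctx) => {
  const { contract, data } = ctx.event.args // field names depend on the runtime
  if (contract !== MARKETPLACE) return // no server-side event filter yet
  // decode `data` with the contract ABI and handle the event here
})

processor.run()
```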

c. Dima's dirty hacking

When I spoke with @dzhelezov (15 Dec 2022 at 17:27:28), he suggested:

I'd rather do the following.

  1. make a separate job/squid that outputs a list of contracts to track
  2. once a day start a new squid which indexes these contracts from scratch
  3. once done, switch the endpoint

That means:

I suggest a cron job that does sqd deploy, sqd prod and sqd kill

so once a day: a JSON with contracts is generated, a new squid is deployed, and it indexes only the contracts in the JSON (the list is read on init)

when the new squid is in sync, the old one is killed, and the alias is switched to the new one
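
The "read on init" part of this could look roughly like the sketch below; `contracts.json` and its shape are assumptions, not a decided format:

```ts
import { readFileSync } from 'fs'
import { lookupArchive } from '@subsquid/archive-registry'
import { SubstrateBatchProcessor } from '@subsquid/substrate-processor'

// contracts.json would be the artifact produced once a day by the
// separate list-generating squid (hypothetical file name and shape).
const contracts: string[] = JSON.parse(readFileSync('./contracts.json', 'utf8'))

let processor = new SubstrateBatchProcessor().setDataSource({
  archive: lookupArchive('shibuya', { release: 'FireSquid' }), // assumed network
})

// Subscribe to every contract from the daily list, so the freshly
// deployed squid indexes exactly these contracts from scratch.
for (const address of contracts) {
  processor = processor.addContractsContractEmitted(address, {
    data: { event: { args: true } },
  })
}
```

The `sqd deploy` / kill / alias-switch steps would live outside the squid, in the cron job itself.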

This solution can break in three ways:

  • Since we need two indexers, the two can get out of sync.
  • If there are too many contracts, could it happen that the new squid never catches up?
  • What if some part of the cron job fails?