
feat: NFT collection trigger refetch Admin API endpoint #10263

Open

vbaranov wants to merge 7 commits into master from vb-nft-collection-trigger-metadata-refetch-admin-api-endpoint

Conversation

vbaranov
Member

@vbaranov vbaranov commented Jun 17, 2024

Resolves #10520

Motivation

This PR adds an Admin API endpoint to re-fetch NFT collection metadata.

Changelog

Docs update PR blockscout/docs#362.

A new endpoint for triggering NFT collection metadata refetch:

PATCH /api/v2/tokens/:token_contract_address_hash/instances/refetch-metadata

Like every admin-related endpoint, it requires an API key to be sent in the body of the request:

{
    "api_key": "123"
}
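
For reference, the endpoint can be exercised from IEx roughly like this. This is only a sketch: it assumes HTTPoison and Jason are available, and the host, address hash, and key below are placeholders.

url =
  "https://blockscout.example.com/api/v2/tokens/0xYourTokenContractAddressHash/instances/refetch-metadata"

body = Jason.encode!(%{"api_key" => "123"})
headers = [{"content-type", "application/json"}]

# 200 means the collection was queued for a metadata refetch; 401 means the API key is missing or wrong.
{:ok, %HTTPoison.Response{status_code: status_code}} = HTTPoison.patch(url, body, headers)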

AI agent summary

This pull request includes several changes to the BlockScoutWeb and Explorer modules, primarily focusing on refactoring the handling of token instances and adding new functionality for NFT collection metadata refetching. The most important changes are grouped by theme below:

Refactoring Token Instance Handling

  • Updated various controllers to use Instance module methods instead of Chain module methods for handling token instances. (apps/block_scout_web/lib/block_scout_web/controllers/api/v2/token_controller.ex, apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/holder_controller.ex, apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/metadata_controller.ex, apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/transfer_controller.ex, apps/block_scout_web/lib/block_scout_web/controllers/tokens/inventory_controller.ex) [1] [2] [3] [4] [5] [6] [7] [8]

Adding NFT Collection Metadata Refetching

  • Introduced a new endpoint and method for triggering NFT collection metadata refetching on demand. (apps/block_scout_web/lib/block_scout_web/controllers/api/v2/token_controller.ex, apps/block_scout_web/lib/block_scout_web/routers/tokens_api_v2_router.ex) [1] [2]
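
  The route addition has roughly this shape (illustrative only; the exact scope, pipelines, and controller alias live in tokens_api_v2_router.ex):

  patch(
    "/:token_contract_address_hash/instances/refetch-metadata",
    V2.TokenController,
    :trigger_nft_collection_metadata_refetch
  )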

Testing Enhancements

  • Added tests for the new NFT collection metadata refetching functionality. (apps/block_scout_web/test/block_scout_web/controllers/api/v2/token_controller_test.exs)

Code Cleanup

  • Removed obsolete methods related to token instance handling from the Chain module. (apps/explorer/lib/explorer/chain.ex) [1] [2] [3]

Alias Additions

  • Added necessary aliases for the Instance module in various files to support the refactoring changes. (apps/block_scout_web/lib/block_scout_web/controllers/api/v2/token_controller.ex, apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/holder_controller.ex, apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/metadata_controller.ex, apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/transfer_controller.ex, apps/block_scout_web/lib/block_scout_web/controllers/tokens/inventory_controller.ex, apps/block_scout_web/lib/block_scout_web/views/api/v2/address_view.ex, apps/block_scout_web/test/block_scout_web/controllers/api/v2/token_controller_test.exs) [1] [2] [3] [4] [5] [6] [7]

Checklist for your Pull Request (PR)

Summary by CodeRabbit

  • New Features

    • Introduced an API endpoint to trigger an on-demand refresh of NFT collection metadata, empowering users to update token details as needed.
    • Added functionality for fetching token instances marked for refetching, enhancing data management.
  • Improvements

    • Enhanced the retrieval and display of token information with more dynamic handling of blockchain-specific details.
    • Strengthened security with improved API key validation for metadata refresh operations.
    • Enabled flexible configuration of token metadata refresh processes for better performance and reliability.

@vbaranov vbaranov force-pushed the vb-nft-collection-trigger-metadata-refetch-admin-api-endpoint branch from 113ef84 to a159833 Compare June 17, 2024 14:47
@vbaranov vbaranov self-assigned this Jun 17, 2024
@vbaranov vbaranov force-pushed the vb-nft-collection-trigger-metadata-refetch-admin-api-endpoint branch from a159833 to defadbf Compare July 15, 2024 16:45
@vbaranov vbaranov marked this pull request as ready for review July 15, 2024 16:50
@vbaranov vbaranov force-pushed the vb-nft-collection-trigger-metadata-refetch-admin-api-endpoint branch from defadbf to 9745d66 Compare August 8, 2024 11:14
@vbaranov vbaranov marked this pull request as draft August 12, 2024 18:57
@vbaranov vbaranov force-pushed the vb-nft-collection-trigger-metadata-refetch-admin-api-endpoint branch from 9745d66 to 502fbb0 Compare January 2, 2025 12:44

coderabbitai bot commented Jan 2, 2025

Walkthrough

This pull request refactors how NFT token instances and metadata are managed across the BlockScout web interface, Explorer, and Indexer components. Function calls previously routed through the Chain module are updated to use the Instance module. A new API endpoint is added to trigger NFT collection metadata refetch, which validates an API key before processing. Additionally, the pull request introduces new modules and configuration settings to support supervised refetching, along with enhanced tests to verify the new behaviors and error handling.

Changes

File(s) Change Summary

  • apps/block_scout_web/lib/block_scout_web/controllers/.../token_controller.ex, .../tokens/instance/{holder,metadata,transfer}.ex, .../tokens/inventory_controller.ex, .../routers/tokens_api_v2_router.ex, .../views/api/v2/address_view.ex:
    Updated function calls from Chain.* to Instance.*, added new method trigger_nft_collection_metadata_refetch with API key validation, and introduced a new PATCH route to trigger metadata refetch.

  • apps/block_scout_web/test/.../token_controller_test.exs:
    Added tests verifying the metadata refetch trigger behavior and API key protection.

  • apps/explorer/lib/explorer/chain.ex, .../chain/token/instance.ex, apps/explorer/test/.../token/instance_test.exs, apps/explorer/test/.../chain_test.exs:
    Refactored token instance management by removing legacy functions, updating function calls to use the Instance module, and adding new streaming/upsert functionalities and tests.

  • apps/indexer/lib/indexer/{application.ex,fetcher/token_instance/{helper.ex,realtime.ex,refetch.ex,retry.ex,sanitize.ex},fetcher/on_demand/nft_collection_metadata_refetch.ex}:
    Modified indexer supervision and fetching logic by adding new refetch modules for token instances and NFT metadata, updating aliases from Chain to Instance, and expanding pool configuration.

  • config/runtime.exs, docker-compose/envs/common-blockscout.env:
    Introduced new configuration entries and environment variables to control token instance refetching, including batch size and concurrency parameters.

Assessment against linked issues

Objective Addressed Explanation
NFTs: Refresh collection handle with API key protection and metadata re-fetch logic (#10520)

Suggested reviewers

  • nikitosing
  • sl1depengwyn
  • fedor-ivn

📜 Recent review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between b4ed7b0 and 7a30e6c.

📒 Files selected for processing (1)
  • apps/block_scout_web/lib/block_scout_web/routers/tokens_api_v2_router.ex (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • apps/block_scout_web/lib/block_scout_web/routers/tokens_api_v2_router.ex
⏰ Context from checks skipped due to timeout of 90000ms (6)
  • GitHub Check: Indexer Tests (default)
  • GitHub Check: Explorer Tests (default)
  • GitHub Check: EthereumJSONRPC Tests (default)
  • GitHub Check: Blockscout Web Tests (default)
  • GitHub Check: Dialyzer static analysis (default)
  • GitHub Check: Analyze (javascript)


@vbaranov vbaranov force-pushed the vb-nft-collection-trigger-metadata-refetch-admin-api-endpoint branch from dae7349 to d217dba Compare January 2, 2025 14:48
@vbaranov vbaranov marked this pull request as ready for review January 2, 2025 14:51
@vbaranov vbaranov requested review from a team and removed request for nikitosing, varasev, fedor-ivn, sl1depengwyn and Qwerty5Uiop January 2, 2025 14:51
@vbaranov vbaranov force-pushed the vb-nft-collection-trigger-metadata-refetch-admin-api-endpoint branch from d217dba to fc73b26 Compare January 2, 2025 14:55

@coderabbitai coderabbitai bot left a comment

Caution

Inline review comments failed to post. This is likely due to GitHub's limits when posting large numbers of comments.

Actionable comments posted: 2

🧹 Nitpick comments (11)
apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex (2)

6-7: Advisory on use Indexer.Fetcher, restart: :permanent.

The permanent restart strategy is appropriate for critical processes. Ensure that transient errors or one-off failures do not cause unintended restarts. A short-lived exponential backoff may help if this process faces external dependencies that can fail intermittently.
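
For reference, the restart strategy is the restart: field of an OTP child spec; a non-permanent variant would look roughly like the sketch below (not part of this PR, which keeps :permanent via use Indexer.Fetcher, restart: :permanent).

# Standard OTP child_spec shape; :transient restarts the process only after
# abnormal exits, whereas :permanent always restarts it.
def child_spec(init_arg) do
  %{
    id: __MODULE__,
    start: {__MODULE__, :start_link, [init_arg]},
    restart: :transient
  }
end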


38-47: Check error handling in run/2.

Currently, if batch_fetch_instances/1 encounters a failure, it doesn't appear to return an error or halt. Confirm that this is intentional. If partial failures need to be captured or retried, consider implementing robust error handling or logging.

apps/block_scout_web/lib/block_scout_web/controllers/api/v2/token_controller.ex (2)

8-8: Duplicate alias for Explorer.Chain.Token.Instance.

Line 8 aliases Instance, and the subsequent line also aliases it as part of Explorer.Chain.{..., Token.Instance}. This duplicate alias might override or cause confusion. Consider removing one for clarity.

-  alias Explorer.Chain.Token.Instance
  alias Explorer.Chain.{Address, BridgedToken, Token, Token.Instance}

369-388: Security note on trigger_nft_collection_metadata_refetch/2.

It strictly compares params["api_key"] to the configured :sensitive_endpoints_api_key. This is fine for basic protection, but consider hashing or rotating the key for stronger security. If the key is absent or mismatched, the function short-circuits. Confirm that logs do not accidentally expose the key.
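
If a stricter check is desired, a constant-time comparison is one option. A minimal sketch, assuming the key lives under the :sensitive_endpoints_api_key app env of :block_scout_web (the app name is an assumption) and that plug_crypto is available; this is not the PR's code:

# Hypothetical helper: constant-time comparison of the supplied key with the
# configured one, rejecting non-binary values outright.
defp valid_api_key?(params) do
  configured = Application.get_env(:block_scout_web, :sensitive_endpoints_api_key)
  provided = params["api_key"]

  is_binary(configured) and is_binary(provided) and
    Plug.Crypto.secure_compare(provided, configured)
end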

apps/explorer/lib/explorer/chain/token/instance.ex (1)

891-902: mark_nft_collection_to_refetch/1 sets metadata to nil and error to the sentinel.

This is a simple bulk approach, but be mindful that large-scale updates could be expensive on large tables. For bigger data sets, consider chunking or pagination to avoid lock contention.
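
A chunked variant could look roughly like the sketch below. It is illustrative only: the sentinel value, the Repo alias, and the batching-by-token_id strategy are assumptions layered on the description above.

import Ecto.Query

# Placeholder sentinel; the PR defines its own "marked to refetch" error value.
@refetch_error "marked to refetch"

defp mark_in_batches(token_contract_address_hash, last_token_id \\ nil, batch_size \\ 1_000) do
  id_query =
    from(i in __MODULE__,
      where: i.token_contract_address_hash == ^token_contract_address_hash,
      order_by: [asc: i.token_id],
      limit: ^batch_size,
      select: i.token_id
    )

  id_query =
    if last_token_id, do: where(id_query, [i], i.token_id > ^last_token_id), else: id_query

  case Repo.all(id_query) do
    [] ->
      :ok

    token_ids ->
      # Update one bounded batch at a time to keep each statement's lock window small.
      __MODULE__
      |> where([i], i.token_contract_address_hash == ^token_contract_address_hash)
      |> where([i], i.token_id in ^token_ids)
      |> Repo.update_all(set: [metadata: nil, error: @refetch_error])

      mark_in_batches(token_contract_address_hash, List.last(token_ids), batch_size)
  end
end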

apps/indexer/lib/indexer/fetcher/on_demand/token_metadata_refetch.ex (2)

19-21: fetch_metadata/2 calls mark_nft_collection_to_refetch/1.

This is a minimal approach that unconditionally marks the entire NFT collection for refetch. If partial refetch or selective logic is needed, consider more granular design.


32-37: handle_cast callback in a permanent fetcher process.

The process does not reply to the caller and simply updates the DB state. That's fine for a basic "fire-and-forget" design. Consider adding logs or metrics to track successful triggers or errors.

docker-compose/envs/common-blockscout.env (2)

176-176: Consider environment-specific toggles for refetch fetcher.

Leaving the refetch fetcher enabled by default (i.e. not setting INDEXER_DISABLE_TOKEN_INSTANCE_REFETCH_FETCHER) may trigger unintentional re-fetch operations in local development. Consider toggling it per environment to control resource usage.


201-202: Uncomment or remove these lines based on usage needs.

These lines hint at possible concurrency/batch configurations. If these features are indeed required, consider uncommenting and assigning meaningful values; otherwise, removing them prevents confusion.

apps/block_scout_web/test/block_scout_web/controllers/api/v2/token_controller_test.exs (1)

1668-1733: Tests for triggering NFT collection metadata refetch are comprehensive.

  1. Success case: Validates the proper clearance of existing token instance metadata, followed by a 200 response.
  2. Unauthorized case: Verifies 401 when no API key is provided.

These tests thoroughly cover positive and negative scenarios. Additionally, consider asserting that no operations occur if an invalid API key is provided (malformed vs. missing).
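
An extra case along those lines might look like this (hypothetical names; the factory attributes and path mirror the existing tests only approximately):

test "returns 401 when an invalid API key is provided", %{conn: conn} do
  token = insert(:token, type: "ERC-721")

  conn =
    patch(conn, "/api/v2/tokens/#{token.contract_address_hash}/instances/refetch-metadata", %{
      "api_key" => "wrong key"
    })

  assert json_response(conn, 401)
end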

config/runtime.exs (1)

897-900: Validate concurrency and batch size configuration.

The INDEXER_TOKEN_INSTANCE_REFETCH_CONCURRENCY and INDEXER_TOKEN_INSTANCE_REFETCH_BATCH_SIZE settings give valuable tuning options. Ensure these environment variables are well-documented and comprehensively tested with varying loads to prevent potential bottlenecks or excessive memory usage.
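
For illustration, the corresponding runtime configuration might take roughly this shape (a sketch using plain System.get_env; the PR's actual config/runtime.exs may rely on the project's config helpers):

config :indexer, Indexer.Fetcher.TokenInstance.Refetch,
  concurrency: String.to_integer(System.get_env("INDEXER_TOKEN_INSTANCE_REFETCH_CONCURRENCY") || "10"),
  batch_size: String.to_integer(System.get_env("INDEXER_TOKEN_INSTANCE_REFETCH_BATCH_SIZE") || "10")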

🛑 Comments failed to post (2)
apps/indexer/lib/indexer/fetcher/token_instance/realtime.ex (1)

44-44: 💡 Codebase verification

Parameter naming is misleading in the function call

The function call Instance.token_instance_with_unfetched_metadata?(token_id, hash) passes its arguments in the expected order, but the second argument's name does not match the function definition in apps/explorer/lib/explorer/chain/token/instance.ex, which expects:

  • First parameter: token_id (✓ correct)
  • Second parameter: token_contract_address (✗ the variable is named hash at the call site)
  • Optional third parameter: options (defaulting to [])

The variable hash should be renamed to token_contract_address for clarity and consistency with the function's signature.

🔗 Analysis chain

Check parameter order for correctness.

Although referencing the new alias ensures clarity, confirm that the arguments token_id, hash in Instance.token_instance_with_unfetched_metadata? match its expected parameter order.

Run the following script to locate the function definition and confirm the parameter order:

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Check if token_instance_with_unfetched_metadata? expects token_id, then contract_address_hash.

ast-grep --pattern $'def token_instance_with_unfetched_metadata?(token_id, contract_address_hash) { $$$ }'

Length of output: 107


Script:

#!/bin/bash
# Let's try to find the function definition using ripgrep
rg -p "token_instance_with_unfetched_metadata\?" --type elixir

Length of output: 614


Script:

#!/bin/bash
# Let's see the implementation and context of the function
rg -A 10 "token_instance_with_unfetched_metadata\?" apps/explorer/lib/explorer/chain/token/instance.ex

Length of output: 603

apps/indexer/lib/indexer/fetcher/token_instance/retry.ex (1)

34-34: 🛠️ Refactor suggestion

Potential backlog or queue size considerations.

While streaming token instances with errors, the current trampoline call with :timer.sleep/1 might lead to performance overhead if the queue remains full for extended periods. Consider a backpressure or pushback mechanism to avoid potential busy-wait scenarios.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

🧹 Nitpick comments (7)
docker-compose/envs/common-blockscout.env (1)

202-202: Consider adding a default value for INDEXER_TOKEN_INSTANCE_REFETCH_CONCURRENCY.

Other concurrency settings in the file typically have default values. Adding a default value would help maintain consistency and provide guidance for deployment.

-# INDEXER_TOKEN_INSTANCE_REFETCH_CONCURRENCY=
+# INDEXER_TOKEN_INSTANCE_REFETCH_CONCURRENCY=10
config/runtime.exs (1)

897-900: LGTM! Consider documenting the environment variables.

The configuration for TokenInstance.Refetch is well-structured with appropriate defaults for concurrency and batch size. The implementation follows the established patterns in the codebase.

Consider adding these environment variables to the documentation:

  • INDEXER_TOKEN_INSTANCE_REFETCH_CONCURRENCY (default: 10)
  • INDEXER_TOKEN_INSTANCE_REFETCH_BATCH_SIZE (default: 10)
apps/explorer/test/explorer/chain/token/instance_test.exs (1)

155-208: Consider adding more pagination test cases

While the current test effectively verifies basic pagination functionality, consider adding tests for:

  • Empty result set
  • Last page scenario
  • Maximum page size handling
  • Invalid pagination key handling
apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex (3)

1-4: Documentation could be more descriptive

The @moduledoc could be enhanced to better describe:

  • When/why a token instance is marked for re-fetching
  • The relationship with BufferedTask behavior
  • The expected outcomes of the re-fetch process

Consider expanding the documentation like this:

 @moduledoc """
-  Fetches information about a token instance, which is marked to be re-fetched.
+  Fetches information about token instances that are marked for metadata re-fetching.
+
+  This module implements the `BufferedTask` behavior to efficiently process multiple
+  token instances in batches. It monitors instances flagged for re-fetching and
+  updates their metadata through the batch_fetch_instances process.
+
+  Re-fetching is typically triggered when:
+  - Token metadata is found to be incomplete
+  - Previous fetch attempts failed
+  - Manual refresh is requested via the Admin API
 """

38-47: Add logging for better observability

The run/2 function should log the number of instances being processed and any failures during batch_fetch_instances for better debugging.

 @impl BufferedTask
 def run(token_instances, _) when is_list(token_instances) do
+  Logger.debug("Processing #{length(token_instances)} token instances for refetch")
+
   token_instances
   |> Enum.filter(fn %{contract_address_hash: hash, token_id: token_id} ->
     Instance.token_instance_marked_to_refetch?(token_id, hash)
   end)
-  |> batch_fetch_instances()
+  |> case do
+    [] ->
+      Logger.debug("No instances marked for refetch")
+      :ok
+    instances_to_fetch ->
+      Logger.info("Fetching metadata for #{length(instances_to_fetch)} instances")
+      batch_fetch_instances(instances_to_fetch)
+  end

   :ok
 end

49-57: Consider making configuration more flexible

The defaults function could benefit from more configuration options being externalized to the application config.

 defp defaults do
   [
-    flush_interval: :infinity,
+    flush_interval: Application.get_env(:indexer, __MODULE__)[:flush_interval] || :infinity,
     max_concurrency: Application.get_env(:indexer, __MODULE__)[:concurrency] || @default_max_concurrency,
     max_batch_size: Application.get_env(:indexer, __MODULE__)[:batch_size] || @default_max_batch_size,
-    poll: false,
+    poll: Application.get_env(:indexer, __MODULE__)[:poll] || false,
     task_supervisor: __MODULE__.TaskSupervisor
   ]
 end
apps/explorer/lib/explorer/chain/token/instance.ex (1)

Line range hint 96-128: Query functions are well-structured but could benefit from more documentation.

The query functions are logically organized, but some functions like page_token_instance/2 lack documentation.

Consider adding @doc strings to these functions following the module's documentation format:

@doc """
  Pages token instances based on the provided paging options.

  ## Parameters
  - `query`: The base query to page
  - `paging_options`: The PagingOptions struct containing paging parameters

  ## Returns
  - A modified query with paging conditions applied
"""
def page_token_instance(query, %PagingOptions{key: {token_id}, asc_order: true}) do
  where(query, [i], i.token_id > ^token_id)
end
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between d217dba and fc73b26.

📒 Files selected for processing (20)
  • apps/block_scout_web/lib/block_scout_web/controllers/api/v2/token_controller.ex (5 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/holder_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/metadata_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/transfer_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/inventory_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/views/api/v2/address_view.ex (1 hunks)
  • apps/explorer/lib/explorer/chain.ex (0 hunks)
  • apps/explorer/lib/explorer/chain/token/instance.ex (16 hunks)
  • apps/explorer/test/explorer/chain/token/instance_test.exs (2 hunks)
  • apps/explorer/test/explorer/chain_test.exs (0 hunks)
  • apps/indexer/lib/indexer/application.ex (3 hunks)
  • apps/indexer/lib/indexer/fetcher/on_demand/token_metadata_refetch.ex (1 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/helper.ex (2 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/realtime.ex (1 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex (1 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/retry.ex (2 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/sanitize.ex (3 hunks)
  • apps/indexer/lib/indexer/supervisor.ex (2 hunks)
  • config/runtime.exs (2 hunks)
  • docker-compose/envs/common-blockscout.env (2 hunks)
💤 Files with no reviewable changes (2)
  • apps/explorer/test/explorer/chain_test.exs
  • apps/explorer/lib/explorer/chain.ex
🚧 Files skipped from review as they are similar to previous changes (11)
  • apps/indexer/lib/indexer/fetcher/token_instance/sanitize.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/retry.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/realtime.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/metadata_controller.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/holder_controller.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/inventory_controller.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/transfer_controller.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/helper.ex
  • apps/indexer/lib/indexer/supervisor.ex
  • apps/indexer/lib/indexer/fetcher/on_demand/token_metadata_refetch.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/api/v2/token_controller.ex
🧰 Additional context used
📓 Path-based instructions (4)
apps/indexer/lib/indexer/application.ex (1)

Pattern **/*.ex: This is a module written in Elixir. For changed functions, here are general guidelines for reviewing:

  1. Context Awareness:
  • Before raising a comment about missing specifications (@spec) or documentation (@doc), verify whether they exist in the corresponding code.
  • Check if a single @spec or @doc covers multiple clauses of a function (e.g., functions with different argument patterns). Do not raise a comment if one shared @spec or @doc exists.
  2. Behavior-Implementing Functions:
  • If a function is preceded by @impl, confirm that it implements a behavior. Do not raise comments for missing @spec or @doc if:
    • @spec is omitted (this is acceptable).
    • Documentation is omitted or provided in #-style comments.
  3. Predefined Functions:
  • For GenServer functions like child_spec/1 and start_link/1, do not raise comments for missing @spec or @doc.
  4. Documentation Format:
  • When reviewing @doc comments, ensure that they follow this format:
@doc """
  This function does X, Y, and Z and this very first sentence of the comment must not be wrapped if the line exceeds 80 chars.

  The next lines in the documentation comment of any function must be wrapped
  after 80 chars (soft limit). This allows documentation comments to be read
  comfortably in code editors.

  ## Parameters
  - `param1`: Description of param1
  - `param2`: Description of param2

  ## Returns
  - what the function returns in one case
  - what the function returns if there is more than one possible outcome
"""
  5. Exceptions for Private Functions:
  • For private functions:
    • Skip comments about missing @spec if the function has no arguments.
    • Ensure no duplicate @spec is required for multiple clauses.

Example Scenarios to Reduce False Positives

  1. Single Specification and Documentation for Multiple Clauses (i.e., different implementations based on pattern matching of arguments):
@doc """
  Provides an example function.

  ## Parameters
  - `integer`: An integer input.

  ## Returns
  - A string result.
"""
@spec example_function(integer()) :: String.t()
def example_function(1), do: "One"
def example_function(2), do: "Two"
  • Do not raise a comment here, as both @spec and @doc cover all clauses.
  2. Behavior-Implementing Function:
# Handles an example call.
@impl GenServer
def handle_call(:example, _from, state) do
  {:reply, :ok, state}
end
  • Do not raise a comment for missing @spec.
  • Ensure the documentation is valid as a # comment.
  3. Function with No Arguments:
defp no_arg_function, do: :ok
  • Do not raise a comment for a missing @spec.
apps/block_scout_web/lib/block_scout_web/views/api/v2/address_view.ex (1)

Pattern **/*.ex: same Elixir review guidelines as above.
apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex (1)

Pattern **/*.ex: same Elixir review guidelines as above.
apps/explorer/lib/explorer/chain/token/instance.ex (1)

Pattern **/*.ex: same Elixir review guidelines as above.
🔇 Additional comments (12)
docker-compose/envs/common-blockscout.env (1)

176-176: LGTM! New configuration variables for token instance refetching.

The new environment variables align well with the PR objectives and follow consistent naming patterns with other similar configurations in the file.

Also applies to: 201-202

config/runtime.exs (1)

847-849: LGTM! Supervisor configuration follows established patterns.

The configuration for TokenInstance.Refetch.Supervisor follows the same pattern as other supervisor configurations in the codebase, using the standard environment variable parsing helper.

apps/block_scout_web/lib/block_scout_web/views/api/v2/address_view.ex (1)

Line range hint 219-234: LGTM! Clean refactoring of NFT instance retrieval.

The change from Chain to Instance module is well-implemented with proper error handling and maintains the expected behavior.

Let's verify that all usages of the old Chain.nft_instance_from_token_id_and_token_address have been updated:

✅ Verification successful

All references to NFT instance retrieval have been updated to use the new Instance module

The verification shows that:

  • No references to the old Chain.nft_instance_from_token_id_and_token_address exist in the codebase
  • The new Instance.nft_instance_by_token_id_and_token_address is consistently used across all relevant files:
    • Main implementation in explorer/lib/explorer/chain/token/instance.ex
    • Test coverage in explorer/test/explorer/chain/token/instance_test.exs
    • Usage in API controllers and views:
      • block_scout_web/views/api/v2/address_view.ex
      • block_scout_web/controllers/tokens/instance/*_controller.ex
      • block_scout_web/controllers/api/v2/token_controller.ex
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Check for any remaining references to the old Chain module function
# Test: Search for old function name. Expect: No results.
rg "nft_instance_from_token_id_and_token_address"

# Test: Verify consistent usage of new Instance module function
# Expect: All NFT instance retrievals use the new function
rg "nft_instance_by_token_id_and_token_address"

Length of output: 1610

apps/explorer/test/explorer/chain/token/instance_test.exs (2)

6-6: LGTM: PagingOptions alias added for pagination tests

The addition of the PagingOptions alias is appropriate as it's used in the new pagination tests.


85-153: Well-structured test coverage for upsert_token_instance/1

The test cases comprehensively cover:

  • Success path with valid parameters
  • Validation of required fields
  • Error state handling
  • Error state transitions
apps/indexer/lib/indexer/application.ex (2)

36-39: LGTM! Pool size calculation properly integrated

The pool size calculation for the new Refetch fetcher is correctly integrated with the existing calculations.


51-51: Verify supervisor initialization order

The TokenMetadataRefetch supervisor is added to base_children. Ensure there are no dependencies that would require a specific initialization order.

✅ Verification successful

Let me analyze the dependencies further by checking the supervisor implementation and any potential dependencies in the token metadata refetch module.


Let me check the supervisor implementation since we haven't found it in the current file.


Let me check one final thing to see if there are any dependencies in the chain of calls.


Supervisor initialization order is safe

The TokenMetadataRefetch service is a simple GenServer that only depends on the database layer through Explorer.Chain.Token.Instance. It doesn't have any dependencies on other supervisors or services that would require a specific initialization order. The service:

  • Uses GenServer for its implementation
  • Only performs database operations through Ecto
  • Operates independently of other services
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Check for potential supervisor dependencies
rg -A 5 "TokenMetadataRefetch" --type elixir

Length of output: 4388


Script:

#!/bin/bash
# Check supervisor implementation and dependencies
ast-grep --pattern 'defmodule $_ do
  $$$
  use Supervisor
  $$$
end' apps/indexer/lib/indexer/fetcher/on_demand/token_metadata_refetch.ex

# Check for any dependencies on other supervisors or services
cat apps/indexer/lib/indexer/fetcher/on_demand/token_metadata_refetch.ex

Length of output: 1113


Script:

#!/bin/bash
# Look for the supervisor implementation
fd "token_metadata_refetch" --type f --exec cat {}

Length of output: 950


Script:

#!/bin/bash
# Check for dependencies in the chain of calls
rg "mark_nft_collection_to_refetch" --type elixir -A 5

Length of output: 1350

apps/explorer/lib/explorer/chain/token/instance.ex (5)

15-22: Well-structured module attributes and types!

The new constants and types are well-defined:

  • Default page size and paging options are set to reasonable values
  • The api? type is properly defined for API-related function options
  • The @marked_to_refetch constant is appropriately defined for the metadata refetch feature

Line range hint 67-85: Good use of __MODULE__ for better maintainability!

The changeset function has been updated to use __MODULE__ instead of hardcoding the module name, which improves maintainability.


1072-1083: Well-implemented token instance fetching with proper preloading!

The address_to_unique_tokens/3 function is well-structured with:

  • Proper use of paging options with defaults
  • Efficient preloading of associations
  • Clear mapping of results with owner information

881-988: Well-implemented metadata refetch functionality with proper error handling!

The new functions for metadata refetch are well-structured with:

  • Clear documentation following the module's format
  • Proper error handling and status tracking
  • Efficient database queries

1157-1220: Sophisticated retry mechanism with potential edge case.

The retry logic is well-implemented with exponential backoff, but there's a potential edge case to consider.

The retry mechanism uses exponential backoff with a maximum retry count, but we should verify the behavior when approaching the maximum value of a smallint (32767).

Additionally, consider adding a comment explaining the exponential backoff formula:

# The retry interval is calculated as: coef * base^min(retries_count + 1, max_retry_count)
# where max_retry_count is derived from max_refetch_interval to prevent overflow
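
Expressed as code, the described calculation is roughly the following sketch (coef, base, and max_retry_count stand for whatever the module actually configures):

defp refetch_interval(retries_count, coef, base, max_retry_count) do
  round(coef * :math.pow(base, min(retries_count + 1, max_retry_count)))
end
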
✅ Verification successful

Let me gather more information about the retry mechanism implementation.


Let me check the migration files to understand how retries_count is defined in the database schema.


The retry mechanism is safe from integer overflow

The concern about hitting the smallint limit is already addressed in the codebase:

  1. The retries_count is defined as a :smallint in the database schema, which has a range of -32768 to 32767
  2. The code defines @max_retries_count_value 32767 and uses it with the error_to_ban_interval mapping
  3. The token instance becomes banned when retries_count + 1 > max_retries_count_before_ban, which prevents further retries
  4. The exponential backoff formula is bounded by max_retry_count which is derived from max_refetch_interval

These safeguards ensure that:

  • The retries will stop well before reaching the smallint limit
  • The instance will be marked as banned, preventing further increment of retries_count
  • The exponential backoff is capped by both the retry count and maximum interval
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Check if there are any instances close to max retries
# that might hit the smallint limit

rg -A 5 "retries_count.*32767" .

Length of output: 496


Script:

#!/bin/bash
# Check the schema definition for retries_count
ast-grep --pattern 'field :retries_count, $_'

# Check if there are any constraints or validations on retries_count
rg -B2 -A2 'retries_count.*validate' .

# Look for migrations related to retries_count
fd -e exs -x grep -l 'retries_count' {}

Length of output: 674


Script:

#!/bin/bash
# Check the migration files content
cat ./apps/explorer/priv/repo/migrations/20240503091708_add_nft_instance_fetcher_aux_fields.exs
cat ./apps/explorer/priv/repo/migrations/20240828140638_add_token_balance_retry_fields.exs

Length of output: 720

Comment on lines 210 to 159
describe "nft_instance_by_token_id_and_token_address/2" do
test "return NFT instance" do
token = insert(:token)

token_id = 10

insert(:token_instance,
token_contract_address_hash: token.contract_address_hash,
token_id: token_id
)

assert {:ok, result} =
Chain.nft_instance_by_token_id_and_token_address(
token_id,
token.contract_address_hash
)

assert result.token_id == Decimal.new(token_id)
end
end

🛠️ Refactor suggestion

Add error case tests for NFT instance retrieval

The current test only covers the successful retrieval case. Consider adding tests for:

  • Non-existent token_id
  • Invalid token contract address
  • Invalid token_id format (non-numeric)

Example structure for additional tests:

test "returns error for non-existent token_id" do
  token = insert(:token)
  token_id = 999

  assert {:error, :not_found} =
           Chain.nft_instance_by_token_id_and_token_address(
             token_id,
             token.contract_address_hash
           )
end

Comment on lines +28 to +36
@impl BufferedTask
def init(initial_acc, reducer, _) do
  {:ok, acc} =
    Instance.stream_token_instances_marked_to_refetch(initial_acc, fn data, acc ->
      reducer.(data, acc)
    end)

  acc
end

🛠️ Refactor suggestion

Consider adding error handling for stream_token_instances_marked_to_refetch

The init/3 function assumes the stream operation will always succeed. Consider handling potential database errors.

 @impl BufferedTask
 def init(initial_acc, reducer, _) do
-  {:ok, acc} =
-    Instance.stream_token_instances_marked_to_refetch(initial_acc, fn data, acc ->
-      reducer.(data, acc)
-    end)
+  case Instance.stream_token_instances_marked_to_refetch(initial_acc, fn data, acc ->
+    reducer.(data, acc)
+  end) do
+    {:ok, acc} -> acc
+    {:error, reason} ->
+      Logger.error("Failed to stream token instances: #{inspect(reason)}")
+      initial_acc
+  end
-  acc
 end
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change

# Before:
@impl BufferedTask
def init(initial_acc, reducer, _) do
  {:ok, acc} =
    Instance.stream_token_instances_marked_to_refetch(initial_acc, fn data, acc ->
      reducer.(data, acc)
    end)

  acc
end

# After:
@impl BufferedTask
def init(initial_acc, reducer, _) do
  case Instance.stream_token_instances_marked_to_refetch(initial_acc, fn data, acc ->
         reducer.(data, acc)
       end) do
    {:ok, acc} -> acc

    {:error, reason} ->
      Logger.error("Failed to stream token instances: #{inspect(reason)}")
      initial_acc
  end
end

@vbaranov vbaranov force-pushed the vb-nft-collection-trigger-metadata-refetch-admin-api-endpoint branch 2 times, most recently from e08526d to 3f4f022 Compare January 2, 2025 15:21

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

🧹 Nitpick comments (3)
apps/block_scout_web/lib/block_scout_web/controllers/api/v2/token_controller.ex (1)

369-388: Provide documentation for the new public endpoint & consider header-based API key.

  1. Add a @doc to clarify the purpose and usage of this new endpoint.
  2. Consider placing the sensitive API key in a header instead of the request body for better security and potential logging avoidance.
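
A header-based variant of point 2 could read roughly like this (illustrative; the header name and helper are assumptions, not the PR's code):

# Hypothetical: prefer an `x-api-key` request header over a body field.
defp api_key_from_header(conn) do
  case Plug.Conn.get_req_header(conn, "x-api-key") do
    [key | _] -> key
    [] -> nil
  end
end
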
apps/indexer/lib/indexer/fetcher/on_demand/nft_collection_metadata_refetch.ex (1)

14-17: Add @doc for public function.

trigger_refetch/1 is a public API for this GenServer, so consider adding a short @doc comment to clarify its purpose and usage.

@doc """
Triggers a refetch operation for the given token.

## Parameters
  - `token`: The token struct whose metadata needs to be refetched.

## Returns
  - `:ok`
"""
@spec trigger_refetch(Token.t()) :: :ok
def trigger_refetch(token) do
  ...
end
docker-compose/envs/common-blockscout.env (1)

201-202: Clarify or remove commented environment variables.

If these vars are desired, provide defaults or uncomment them; otherwise, remove to avoid confusion.

-# INDEXER_TOKEN_INSTANCE_REFETCH_BATCH_SIZE=10
-# INDEXER_TOKEN_INSTANCE_REFETCH_CONCURRENCY=
+INDEXER_TOKEN_INSTANCE_REFETCH_BATCH_SIZE=10
+INDEXER_TOKEN_INSTANCE_REFETCH_CONCURRENCY=5
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between fc73b26 and 3f4f022.

📒 Files selected for processing (21)
  • apps/block_scout_web/lib/block_scout_web/controllers/api/v2/token_controller.ex (5 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/holder_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/metadata_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/transfer_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/inventory_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/views/api/v2/address_view.ex (1 hunks)
  • apps/block_scout_web/test/block_scout_web/controllers/api/v2/token_controller_test.exs (2 hunks)
  • apps/explorer/lib/explorer/chain.ex (0 hunks)
  • apps/explorer/lib/explorer/chain/token/instance.ex (16 hunks)
  • apps/explorer/test/explorer/chain/token/instance_test.exs (2 hunks)
  • apps/explorer/test/explorer/chain_test.exs (0 hunks)
  • apps/indexer/lib/indexer/application.ex (3 hunks)
  • apps/indexer/lib/indexer/fetcher/on_demand/nft_collection_metadata_refetch.ex (1 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/helper.ex (2 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/realtime.ex (1 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex (1 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/retry.ex (2 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/sanitize.ex (3 hunks)
  • apps/indexer/lib/indexer/supervisor.ex (2 hunks)
  • config/runtime.exs (2 hunks)
  • docker-compose/envs/common-blockscout.env (2 hunks)
💤 Files with no reviewable changes (2)
  • apps/explorer/lib/explorer/chain.ex
  • apps/explorer/test/explorer/chain_test.exs
🚧 Files skipped from review as they are similar to previous changes (11)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/inventory_controller.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/transfer_controller.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/metadata_controller.ex
  • apps/indexer/lib/indexer/application.ex
  • apps/indexer/lib/indexer/supervisor.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/realtime.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/helper.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/holder_controller.ex
  • apps/block_scout_web/lib/block_scout_web/views/api/v2/address_view.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/retry.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/sanitize.ex
🧰 Additional context used
📓 Path-based instructions (4)
apps/indexer/lib/indexer/fetcher/on_demand/nft_collection_metadata_refetch.ex (1)

Pattern **/*.ex: same Elixir review guidelines as above.
apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex (1)

Pattern **/*.ex: same Elixir review guidelines as above.
apps/block_scout_web/lib/block_scout_web/controllers/api/v2/token_controller.ex (1)

Pattern **/*.ex: same Elixir review guidelines as above.
apps/explorer/lib/explorer/chain/token/instance.ex (1)

Pattern **/*.ex: same Elixir review guidelines as above.
🔇 Additional comments (17)
apps/block_scout_web/lib/block_scout_web/controllers/api/v2/token_controller.ex (5)

11-11: Alias addition looks good.

No concerns here; referencing NFTCollectionMetadataRefetchOnDemand aligns with the new functionality.


201-202: Implementation looks correct.

This call nicely leverages the new nft_instance_by_token_id_and_token_address function, and the with statement handles errors gracefully.


206-206: Owner assignment logic looks good.

Attaching the token owner within the pipeline is a clean approach.


355-356: Refetch metadata call integrated properly.

This code reuses nft_instance_by_token_id_and_token_address under the existing with error flow, which is aligned with the refactor.


178-178: Verify usage consistency.

Switching to Instance.address_to_unique_tokens is consistent with the refactor. Verify that any other references to Chain.address_to_unique_tokens have also been appropriately updated.

✅ Verification successful

All references to address_to_unique_tokens have been properly updated

The verification shows that all instances of address_to_unique_tokens are now using the Instance module instead of Chain. The function appears in three locations:

  • apps/explorer/test/explorer/chain/token/instance_test.exs
  • apps/block_scout_web/lib/block_scout_web/controllers/api/v2/token_controller.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/inventory_controller.ex

All of these are using the new Instance.address_to_unique_tokens implementation, and there are no remaining references to Chain.address_to_unique_tokens in the codebase.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Searching references to the old function usage to confirm they have all been replaced
rg "Chain\.address_to_unique_tokens"

Length of output: 38


Script:

#!/bin/bash
# Let's search for both old and new function references to get a complete picture
echo "Searching for Instance.address_to_unique_tokens:"
rg "Instance\.address_to_unique_tokens" -A 2

echo -e "\nSearching for any remaining Chain references in token controller:"
fd "token_controller.ex" --exec rg "Chain\." {} -A 2

Length of output: 10449

apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex (1)

28-36: Same comment about error handling for DB streaming.

apps/explorer/test/explorer/chain/token/instance_test.exs (4)

6-6: Alias addition is fine.

No objection to bringing Explorer.PagingOptions into scope here.


85-153: Upsert token instance tests look thorough.

Covers valid and invalid scenarios, including inserting errors and clearing them.


155-208: Pagination test confirmed.

Good approach for verifying scenario-based pagination.


210-229: Add negative scenario tests for NFT instance lookups.

docker-compose/envs/common-blockscout.env (1)

176-176: Enabling token instance refetch fetcher is fine.

Make sure it’s documented in deployment notes.

apps/explorer/lib/explorer/chain/token/instance.ex (4)

15-22: LGTM! Well-structured constant and type declarations.

The new constants and types are well-defined with appropriate values for pagination and metadata refetching.


881-902: LGTM! Well-implemented metadata refetch function.

The mark_nft_collection_to_refetch/1 function is well-documented and efficiently implemented using batch updates with proper error handling.


904-938: LGTM! Well-designed streaming functions.

The streaming functions are efficiently implemented using stream_reduce with proper documentation and error handling.


940-988: LGTM! Well-implemented helper functions.

The helper functions are efficiently implemented using exists? queries with proper documentation and type specifications.

apps/block_scout_web/test/block_scout_web/controllers/api/v2/token_controller_test.exs (1)

1668-1733: LGTM! Comprehensive test coverage for the new endpoint.

The tests thoroughly verify:

  • Successful metadata refetch with API key
  • Unauthorized access handling
  • Database state verification
config/runtime.exs (1)

Line range hint 847-900: LGTM! Well-structured configuration for the new fetcher.

The configuration properly defines:

  • Supervisor settings with disable flag
  • Fetcher settings with configurable concurrency and batch size

@vbaranov vbaranov force-pushed the vb-nft-collection-trigger-metadata-refetch-admin-api-endpoint branch from 3f4f022 to 26cf22c on January 2, 2025 16:04
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 0

🧹 Nitpick comments (8)
config/runtime.exs (1)

897-900: LGTM! Consider documenting the environment variables.

The configuration for the token instance refetch fetcher is well-structured with reasonable defaults for concurrency and batch size. However, it would be beneficial to document these environment variables in the project's documentation to help with deployment configuration.

Environment variables to document:

  • INDEXER_TOKEN_INSTANCE_REFETCH_CONCURRENCY
  • INDEXER_TOKEN_INSTANCE_REFETCH_BATCH_SIZE
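For reference, a minimal sketch of how such variables are typically wired up in config/runtime.exs; the helper name and the defaults of 10 are assumptions rather than values taken from this PR:

config :indexer, Indexer.Fetcher.TokenInstance.Refetch,
  concurrency: ConfigHelper.parse_integer_env_var("INDEXER_TOKEN_INSTANCE_REFETCH_CONCURRENCY", 10),
  batch_size: ConfigHelper.parse_integer_env_var("INDEXER_TOKEN_INSTANCE_REFETCH_BATCH_SIZE", 10)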
apps/explorer/test/explorer/chain/token/instance_test.exs (3)

85-153: Add more test cases for comprehensive coverage.

While the current test cases cover basic scenarios, consider adding these additional test cases for better coverage:

  1. Updating an existing instance with different metadata
  2. Handling of invalid token_id formats (non-numeric values)
  3. Handling of non-existent token contract addresses

Example structure for additional test:

test "updates existing instance with new metadata" do
  token = insert(:token)
  
  # Insert initial instance
  {:ok, instance} = Instance.upsert_token_instance(%{
    token_id: 1,
    token_contract_address_hash: token.contract_address_hash,
    metadata: %{uri: "http://example.com"}
  })
  
  # Update with new metadata
  {:ok, updated} = Instance.upsert_token_instance(%{
    token_id: 1,
    token_contract_address_hash: token.contract_address_hash,
    metadata: %{uri: "http://new-example.com"}
  })
  
  assert updated.id == instance.id
  assert updated.metadata.uri == "http://new-example.com"
end

155-208: Enhance pagination test coverage.

The current test verifies basic pagination functionality, but consider adding these test cases:

  1. Empty result set scenario
  2. Invalid pagination parameters handling
  3. Maximum page size limits

Example structure for empty result test:

test "returns empty list when no tokens exist" do
  token_contract_address = insert(:contract_address)
  token = insert(:token, contract_address: token_contract_address, type: "ERC-721")
  
  paging_options = %PagingOptions{page_size: 10}
  
  result =
    token_contract_address.hash
    |> Instance.address_to_unique_tokens(token, paging_options: paging_options)
    |> Enum.to_list()
    
  assert result == []
end

Line range hint 1-229: Improve test file organization and maintainability.

Consider these structural improvements:

  1. Use setup blocks for common test data (token and contract address creation)
  2. Add documentation for complex test scenarios
  3. Maintain consistent assertion style throughout tests

Example setup block:

describe "token instance operations" do
  setup do
    token_contract_address = insert(:contract_address)
    token = insert(:token, contract_address: token_contract_address, type: "ERC-721")
    
    %{
      token: token,
      contract_address: token_contract_address
    }
  end
  
  test "test case", %{token: token} do
    # Test implementation
  end
end
apps/indexer/test/indexer/fetcher/token_instance/refetch_test.exs (4)

1-3: Consider adding a module-level docstring.
Providing a short @moduledoc can help clarify the test module’s purpose and usage, making it easier to navigate.
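A possible wording, purely as an illustration:

@moduledoc """
  Tests for Indexer.Fetcher.TokenInstance.Refetch, the fetcher that re-fetches
  metadata for token instances marked to refetch on demand.
"""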


8-20: Expand edge case testing for child_spec/1.
Currently, the test verifies merging default options with a provided list. Consider adding a scenario that tests behavior with empty or missing options or conflicting keys to ensure robust coverage.


22-56: Good use of init/3 for initialization logic.
The test verifies that only token instances marked with a specific error state get refetched. It might be beneficial to include a case where the error value is something other than :marked_to_refetch, to confirm that such tokens are excluded from this flow.
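A rough sketch of such an exclusion case, assuming init/3 can be invoked directly the way the existing tests do and that it returns the reduced accumulator (both assumptions):

test "init/3 skips instances whose error is not the refetch marker" do
  token = insert(:token, type: "ERC-721")

  insert(:token_instance,
    token_contract_address_hash: token.contract_address_hash,
    token_id: 1,
    error: "no uri",
    metadata: nil
  )

  # Only instances marked to refetch should be streamed into the accumulator.
  assert [] = Refetch.init([], fn entry, acc -> [entry | acc] end, nil)
end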


58-82: Add negative or no-token tests for run/2.
Testing how the function behaves when no tokens are provided or when tokens do not match the :marked_to_refetch condition would strengthen confidence in the refetch process.

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 3f4f022 and 26cf22c.

📒 Files selected for processing (22)
  • apps/block_scout_web/lib/block_scout_web/controllers/api/v2/token_controller.ex (5 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/holder_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/metadata_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/transfer_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/inventory_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/views/api/v2/address_view.ex (1 hunks)
  • apps/block_scout_web/test/block_scout_web/controllers/api/v2/token_controller_test.exs (2 hunks)
  • apps/explorer/lib/explorer/chain.ex (0 hunks)
  • apps/explorer/lib/explorer/chain/token/instance.ex (16 hunks)
  • apps/explorer/test/explorer/chain/token/instance_test.exs (2 hunks)
  • apps/explorer/test/explorer/chain_test.exs (0 hunks)
  • apps/indexer/lib/indexer/application.ex (3 hunks)
  • apps/indexer/lib/indexer/fetcher/on_demand/nft_collection_metadata_refetch.ex (1 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/helper.ex (2 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/realtime.ex (1 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex (1 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/retry.ex (2 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/sanitize.ex (3 hunks)
  • apps/indexer/lib/indexer/supervisor.ex (2 hunks)
  • apps/indexer/test/indexer/fetcher/token_instance/refetch_test.exs (1 hunks)
  • config/runtime.exs (2 hunks)
  • docker-compose/envs/common-blockscout.env (2 hunks)
💤 Files with no reviewable changes (2)
  • apps/explorer/test/explorer/chain_test.exs
  • apps/explorer/lib/explorer/chain.ex
🚧 Files skipped from review as they are similar to previous changes (14)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/transfer_controller.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/inventory_controller.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/retry.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/helper.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/holder_controller.ex
  • docker-compose/envs/common-blockscout.env
  • apps/indexer/lib/indexer/fetcher/on_demand/nft_collection_metadata_refetch.ex
  • apps/indexer/lib/indexer/application.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/realtime.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/metadata_controller.ex
  • apps/indexer/lib/indexer/supervisor.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/sanitize.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/api/v2/token_controller.ex
  • apps/block_scout_web/lib/block_scout_web/views/api/v2/address_view.ex
👮 Files not reviewed due to content moderation or server errors (3)
  • apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex
  • apps/explorer/lib/explorer/chain/token/instance.ex
  • apps/block_scout_web/test/block_scout_web/controllers/api/v2/token_controller_test.exs
🧰 Additional context used
📓 Path-based instructions (2)
apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex (1)

Pattern **/*.ex: same Elixir review guidelines as above.
apps/explorer/lib/explorer/chain/token/instance.ex (1)

Pattern **/*.ex: same Elixir review guidelines as above.
🔇 Additional comments (3)
config/runtime.exs (1)

847-849: LGTM! Supervisor configuration looks good.

The configuration for the token instance refetch supervisor follows the established pattern and integrates well with the existing configuration structure.

apps/explorer/test/explorer/chain/token/instance_test.exs (1)

210-229: Add error case tests for NFT instance retrieval.

The current test only covers the successful retrieval case. Consider adding tests for:

  • Non-existent token_id
  • Invalid token contract address
  • Invalid token_id format (non-numeric)

Example structure for additional tests:

test "returns error for non-existent token_id" do
  token = insert(:token)
  token_id = 999
  
  assert {:error, :not_found} =
    Instance.nft_instance_by_token_id_and_token_address(
      token_id,
      token.contract_address_hash
    )
end
apps/indexer/test/indexer/fetcher/token_instance/refetch_test.exs (1)

4-7: Aliases are well-structured.
These aliases make the code more readable and highlight the relevant modules clearly. No further changes necessary here.

@vbaranov vbaranov force-pushed the vb-nft-collection-trigger-metadata-refetch-admin-api-endpoint branch from 26cf22c to ea06a22 on January 2, 2025 17:10
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

♻️ Duplicate comments (1)
apps/explorer/test/explorer/chain/token/instance_test.exs (1)

210-229: 🛠️ Refactor suggestion

Add comprehensive test cases for NFT instance retrieval

The current test only covers the successful case. As mentioned in the previous review, add tests for error cases. Additionally, consider:

  1. Validation of all instance fields, not just token_id
  2. Case sensitivity in contract address
  3. Performance with large token_ids

Example structure (in addition to previous suggestions):

test "validates case sensitivity in contract address" do
  token = insert(:token)
  token_id = 10
  
  insert(:token_instance,
    token_contract_address_hash: token.contract_address_hash,
    token_id: token_id
  )
  
  uppercase_address = String.upcase(to_string(token.contract_address_hash))
  
  assert {:ok, result} =
    Instance.nft_instance_by_token_id_and_token_address(
      token_id,
      uppercase_address
    )
    
  assert result.token_id == Decimal.new(token_id)
end
🧹 Nitpick comments (8)
apps/block_scout_web/test/block_scout_web/controllers/api/v2/token_controller_test.exs (2)

1693-1725: Consider adding edge case tests.

While the happy path test is thorough, consider adding tests for:

  • Token with no instances
  • Token with a large number of instances (pagination test)
  • Different token types (ERC-1155, ERC-404)

1727-1734: Consider additional error cases.

The unauthorized test is good, but consider adding tests for:

  • Invalid token address
  • Non-existent token
  • Invalid API key format
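A sketch of one such negative case; the route shape and the rejection status are assumptions, not taken from this PR:

test "rejects a malformed api_key", %{conn: conn} do
  token = insert(:token, type: "ERC-721")

  conn =
    patch(
      conn,
      "/api/v2/tokens/#{token.contract_address_hash}/instances/refetch-metadata",
      %{"api_key" => %{"not" => "a string"}}
    )

  # Any non-success status is acceptable here; the existing unauthorized test
  # expects 401 for a missing key.
  assert conn.status in [401, 403, 422]
end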
apps/explorer/test/explorer/chain/token/instance_test.exs (3)

85-153: Enhance test coverage for upsert_token_instance/1

Consider adding these test cases to improve coverage:

  1. Edge cases for token_id:
    • Maximum/minimum allowed values
    • Non-numeric values
  2. Metadata validation:
    • Invalid URI format
    • Missing required metadata fields
  3. Concurrent upsert operations:
    • Race condition handling

Example structure:

test "handles maximum token_id value" do
  token = insert(:token)
  max_token_id = Integer.pow(2, 256) - 1 # e.g. the uint256 maximum
  
  params = %{
    token_id: max_token_id,
    token_contract_address_hash: token.contract_address_hash,
    metadata: %{uri: "http://example.com"}
  }
  
  {:ok, result} = Instance.upsert_token_instance(params)
  assert result.token_id == Decimal.new(max_token_id)
end

155-208: Enhance pagination test coverage

The current test only covers basic forward pagination. Consider adding:

  1. Backward pagination test
  2. Empty result set test
  3. Verification of complete token instance data, not just token_id
  4. Edge cases for page_size

Example structure:

test "handles empty result set with pagination" do
  token_contract_address = insert(:contract_address)
  token = insert(:token, contract_address: token_contract_address, type: "ERC-721")
  
  paging_options = %PagingOptions{page_size: 10}
  
  result =
    token_contract_address.hash
    |> Instance.address_to_unique_tokens(token, paging_options: paging_options)
    |> Enum.to_list()
    
  assert result == []
end

Line range hint 1-6: Add module documentation for test strategy

Consider adding module documentation to:

  1. Describe the test strategy and coverage goals
  2. List any assumptions or limitations
  3. Explain the test data setup approach

Example:

defmodule Explorer.Chain.Token.InstanceTest do
  @moduledoc """
  Test suite for Token Instance operations.
  
  ## Test Strategy
  - Covers CRUD operations for token instances
  - Validates pagination and retrieval mechanisms
  - Tests error conditions and edge cases
  
  ## Test Data Setup
  Fixtures are used to create tokens and instances with specific attributes
  needed for each test scenario.
  """
  
  use Explorer.DataCase
  # ... rest of the code
apps/explorer/lib/explorer/chain/token/instance.ex (3)

922-924: Fix copy-pasted documentation comment.

The documentation for stream_token_instances_marked_to_refetch/2 appears to be copied from stream_token_instances_with_unfetched_metadata/2. It should be updated to accurately describe its specific purpose of streaming instances marked for refetch.
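One possible wording, following the @doc format used elsewhere in this module (the exact text is an assumption):

@doc """
  Streams token instances that were marked to refetch via the admin endpoint.

  ## Parameters
  - `initial`: The initial accumulator passed to the reducer.
  - `reducer`: A two-arity function applied to each streamed entry.

  ## Returns
  - `{:ok, accumulator}` once all marked instances have been processed.
"""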


1132-1195: Consider extracting complex SQL logic into named functions.

The token instance metadata update logic uses complex SQL fragments that could benefit from being extracted into named functions for better maintainability and testability. Consider creating helper functions for:

  • Retry count calculation
  • Ban status determination
  • Refetch interval computation
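For illustration, a minimal sketch of one such extraction, assuming the refetch interval follows an exponential backoff; the module name, base interval, and cap are placeholders, not the PR's code:

defmodule Explorer.Chain.Token.Instance.RefetchBackoff do
  @moduledoc false

  # Hypothetical helper: the refetch-interval computation pulled out of the
  # SQL fragment into a named, independently testable function.
  @spec interval_seconds(non_neg_integer(), pos_integer()) :: pos_integer()
  def interval_seconds(retries_count, base_seconds \\ 60) do
    base_seconds * Integer.pow(2, min(retries_count, 10))
  end
end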

993-1022: Consider externalizing error priority configuration.

The error priority lists are hardcoded in the function. Consider moving them to the configuration:

+  @high_priority_errors ["request error: 429", ":checkout_timeout"]
+  @negative_priority_errors ["VM execution error", "no uri", "invalid json"]

   def stream_token_instances_with_error(initial, reducer, limited? \\ false) when is_function(reducer, 2) do
-    # likely to get valid metadata
-    high_priority = ["request error: 429", ":checkout_timeout"]
-    # almost impossible to get valid metadata
-    negative_priority = ["VM execution error", "no uri", "invalid json"]

     __MODULE__
     |> where([instance], is_nil(instance.is_banned) or not instance.is_banned)
     |> where([instance], not is_nil(instance.error))
     |> where([instance], instance.error != ^@marked_to_refetch)
     |> where([instance], is_nil(instance.refetch_after) or instance.refetch_after < ^DateTime.utc_now())
     |> select([instance], %{
       contract_address_hash: instance.token_contract_address_hash,
       token_id: instance.token_id
     })
     |> order_by([instance],
       asc: instance.refetch_after,
-      desc: instance.error in ^high_priority,
-      asc: instance.error in ^negative_priority
+      desc: instance.error in ^@high_priority_errors,
+      asc: instance.error in ^@negative_priority_errors
     )
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 26cf22c and ea06a22.

📒 Files selected for processing (22)
  • apps/block_scout_web/lib/block_scout_web/controllers/api/v2/token_controller.ex (5 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/holder_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/metadata_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/transfer_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/inventory_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/views/api/v2/address_view.ex (1 hunks)
  • apps/block_scout_web/test/block_scout_web/controllers/api/v2/token_controller_test.exs (2 hunks)
  • apps/explorer/lib/explorer/chain.ex (0 hunks)
  • apps/explorer/lib/explorer/chain/token/instance.ex (16 hunks)
  • apps/explorer/test/explorer/chain/token/instance_test.exs (2 hunks)
  • apps/explorer/test/explorer/chain_test.exs (0 hunks)
  • apps/indexer/lib/indexer/application.ex (3 hunks)
  • apps/indexer/lib/indexer/fetcher/on_demand/nft_collection_metadata_refetch.ex (1 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/helper.ex (2 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/realtime.ex (1 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex (1 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/retry.ex (2 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/sanitize.ex (3 hunks)
  • apps/indexer/lib/indexer/supervisor.ex (2 hunks)
  • apps/indexer/test/indexer/fetcher/token_instance/refetch_test.exs (1 hunks)
  • config/runtime.exs (2 hunks)
  • docker-compose/envs/common-blockscout.env (2 hunks)
💤 Files with no reviewable changes (2)
  • apps/explorer/test/explorer/chain_test.exs
  • apps/explorer/lib/explorer/chain.ex
🚧 Files skipped from review as they are similar to previous changes (16)
  • apps/indexer/lib/indexer/fetcher/token_instance/retry.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/inventory_controller.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/metadata_controller.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/transfer_controller.ex
  • apps/indexer/lib/indexer/fetcher/on_demand/nft_collection_metadata_refetch.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/realtime.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/helper.ex
  • docker-compose/envs/common-blockscout.env
  • apps/indexer/lib/indexer/application.ex
  • apps/indexer/lib/indexer/supervisor.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/holder_controller.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex
  • apps/block_scout_web/lib/block_scout_web/views/api/v2/address_view.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/sanitize.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/api/v2/token_controller.ex
  • apps/indexer/test/indexer/fetcher/token_instance/refetch_test.exs
🧰 Additional context used
📓 Path-based instructions (1)
apps/explorer/lib/explorer/chain/token/instance.ex (1)

Pattern **/*.ex: same Elixir review guidelines as above.
🔇 Additional comments (10)
config/runtime.exs (2)

847-849: LGTM! Configuration for token instance refetch supervisor

The configuration aligns with the PR objectives for NFT collection metadata refetch functionality, following the established pattern for supervisor configurations.


897-900: LGTM! Configuration for token instance refetch fetcher

The configuration properly sets up concurrency and batch size parameters for the NFT collection metadata refetch feature, with reasonable defaults that can be overridden via environment variables.

apps/block_scout_web/test/block_scout_web/controllers/api/v2/token_controller_test.exs (2)

15-15: LGTM! Alias follows consistent pattern.

The alias for NFTCollectionMetadataRefetch is properly placed and follows the established naming convention.


1673-1691: LGTM! Setup is comprehensive and includes cleanup.

The test setup properly configures the environment, starts required processes, and includes cleanup in on_exit. Good practice!

apps/explorer/lib/explorer/chain/token/instance.ex (6)

11-22: Well-structured module attributes and type definitions!

The new constants and type definitions are clear, properly documented, and follow Elixir conventions.


Line range hint 67-85: Good use of __MODULE__ for better maintainability!

The change from explicit module name to __MODULE__ makes the code more maintainable and reduces the risk of errors during module renaming.


96-102: Improved type safety with proper @spec annotation!

The function is well-documented and the changes enhance type safety while maintaining consistent use of __MODULE__.


881-902: Well-implemented NFT collection refetch functionality!

The new function is thoroughly documented with proper @doc and @spec annotations, and implements the core functionality required by the PR objectives.


Line range hint 645-880: Well-implemented media handling functionality!

The media handling functions are well-documented, properly handle errors, and follow best practices for CDN integration.


Line range hint 1-1195: Solid implementation with good documentation and error handling!

The module successfully implements the NFT collection metadata refetch functionality while maintaining good documentation coverage and proper error handling. The changes align well with the PR objectives and follow Elixir best practices.

Comment on lines 1089 to 1272
@doc """
Put owner address to unique token instance. If not unique, return original instance.
"""
@spec put_owner_to_token_instance(__MODULE__.t(), Token.t(), [api?]) :: __MODULE__.t()
def put_owner_to_token_instance(token_instance, token, options \\ [])

def put_owner_to_token_instance(%__MODULE__{is_unique: nil} = token_instance, token, options) do
put_owner_to_token_instance(__MODULE__.put_is_unique(token_instance, token, options), token, options)
end

def put_owner_to_token_instance(
%__MODULE__{owner: nil, is_unique: true} = token_instance,
%Token{type: type},
options
)
when type in ["ERC-1155", "ERC-404"] do
owner_address_hash =
token_instance
|> __MODULE__.owner_query()
|> Chain.select_repo(options).one()

owner =
Address.get(
owner_address_hash,
options
|> Keyword.merge(
necessity_by_association: %{
:names => :optional,
:smart_contract => :optional,
Implementation.proxy_implementations_association() => :optional
}
)
)

%{token_instance | owner: owner}
end

def put_owner_to_token_instance(%__MODULE__{} = token_instance, _token, _options), do: token_instance

🛠️ Refactor suggestion

Add error handling for database queries.

The owner query operations could fail due to database errors. Consider adding proper error handling:

 def put_owner_to_token_instance(
   %__MODULE__{owner: nil, is_unique: true} = token_instance,
   %Token{type: type},
   options
 )
 when type in ["ERC-1155", "ERC-404"] do
-  owner_address_hash =
-    token_instance
-    |> __MODULE__.owner_query()
-    |> Chain.select_repo(options).one()
+  with {:ok, owner_address_hash} <- 
+    token_instance
+    |> __MODULE__.owner_query()
+    |> Chain.select_repo(options).one()
+    |> case do
+      nil -> {:error, :owner_not_found}
+      address_hash -> {:ok, address_hash}
+    end,
+  {:ok, owner} <- 
+    Address.get(
+      owner_address_hash,
+      options
+      |> Keyword.merge(
+        necessity_by_association: %{
+          :names => :optional,
+          :smart_contract => :optional,
+          Implementation.proxy_implementations_association() => :optional
+        }
+      )
+    ) do
+    %{token_instance | owner: owner}
+  else
+    {:error, _reason} -> token_instance
+  end

Committable suggestion skipped: line range outside the PR's diff.

flush_interval: :infinity,
max_concurrency: Application.get_env(:indexer, __MODULE__)[:concurrency] || @default_max_concurrency,
max_batch_size: Application.get_env(:indexer, __MODULE__)[:batch_size] || @default_max_batch_size,
poll: false,
Member

We should either set poll: true, or somehow trigger fetching from the NFTCollectionMetadataRefetch

Member Author

Processed in 570d370

@@ -134,6 +135,7 @@ defmodule Indexer.Supervisor do
{TokenInstanceRealtime.Supervisor, [[memory_monitor: memory_monitor]]},
{TokenInstanceRetry.Supervisor, [[memory_monitor: memory_monitor]]},
{TokenInstanceSanitize.Supervisor, [[memory_monitor: memory_monitor]]},
{TokenInstanceRefetch.Supervisor, [[memory_monitor: memory_monitor]]},
Member

We probably should place that supervisor in application.ex:

{NFTCollectionMetadataRefetchOnDemand.Supervisor, [json_rpc_named_arguments]},
in order to make refetching possible in API application mode

Member Author

Processed in 570d370

@sl1depengwyn sl1depengwyn requested a review from a team January 31, 2025 05:18
@vbaranov vbaranov force-pushed the vb-nft-collection-trigger-metadata-refetch-admin-api-endpoint branch from ea06a22 to 5c63a5d on February 4, 2025 16:04
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

♻️ Duplicate comments (1)
apps/block_scout_web/test/block_scout_web/controllers/api/v2/token_controller_test.exs (1)

1693-1725: 🛠️ Refactor suggestion

Replace sleep with proper test synchronization.

Using :timer.sleep(100) can make tests flaky. Consider using ExUnit's assert_receive or a similar mechanism.

-:timer.sleep(100)
+assert_receive {:metadata_refetch_complete, _}, 1000
🧹 Nitpick comments (3)
apps/explorer/lib/explorer/chain.ex (1)

3754-3770: LGTM! Consider enhancing the documentation.

The changes to stream_cataloged_tokens look good, with proper ordering and fetcher limit handling. Consider adding documentation for the some_time_ago_updated parameter to explain its unit (minutes) and purpose.

  @doc """
  Streams a list of tokens that have been cataloged.
+
+ ## Parameters
+   * `some_time_ago_updated` - Number of minutes to look back for token updates. Defaults to 2880 (48 hours).
  """
apps/explorer/test/explorer/chain/token/instance_test.exs (1)

155-208: Add test case for empty token_ids.

The pagination test for unique tokens should also verify behavior when token_ids is empty or nil.

 describe "address_to_unique_tokens/2" do
+  test "unique tokens can be paginated through token_id with empty token_ids" do
+    token_contract_address = insert(:contract_address)
+    token = insert(:token, contract_address: token_contract_address, type: "ERC-721")
+
+    transaction =
+      :transaction
+      |> insert()
+      |> with_block(insert(:block, number: 1))
+
+    insert(
+      :token_transfer,
+      block_number: 1000,
+      to_address: build(:address),
+      transaction: transaction,
+      token_contract_address: token_contract_address,
+      token: token,
+      token_ids: []
+    )
+
+    assert [] = Instance.address_to_unique_tokens(token_contract_address.hash, token)
+  end
apps/explorer/lib/explorer/chain/token/instance.ex (1)

1193-1205: Add logging for better observability.

Consider adding logging to track the progress of token instance processing.

 def address_to_unique_tokens(contract_address_hash, token, options \\ []) do
+  require Logger
+  Logger.debug("Fetching unique tokens for contract #{inspect(contract_address_hash)}")
   paging_options = Keyword.get(options, :paging_options, @default_paging_options)
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between ea06a22 and 5c63a5d.

📒 Files selected for processing (14)
  • apps/block_scout_web/lib/block_scout_web/controllers/api/v2/token_controller.ex (5 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/holder_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/metadata_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/transfer_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/inventory_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/routers/tokens_api_v2_router.ex (1 hunks)
  • apps/block_scout_web/lib/block_scout_web/views/api/v2/address_view.ex (1 hunks)
  • apps/block_scout_web/test/block_scout_web/controllers/api/v2/token_controller_test.exs (2 hunks)
  • apps/explorer/lib/explorer/chain.ex (1 hunks)
  • apps/explorer/lib/explorer/chain/token/instance.ex (18 hunks)
  • apps/explorer/test/explorer/chain/token/instance_test.exs (2 hunks)
  • apps/explorer/test/explorer/chain_test.exs (0 hunks)
  • apps/indexer/lib/indexer/application.ex (3 hunks)
  • apps/indexer/lib/indexer/fetcher/on_demand/nft_collection_metadata_refetch.ex (1 hunks)
💤 Files with no reviewable changes (1)
  • apps/explorer/test/explorer/chain_test.exs
🚧 Files skipped from review as they are similar to previous changes (9)
  • apps/block_scout_web/lib/block_scout_web/routers/tokens_api_v2_router.ex
  • apps/indexer/lib/indexer/application.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/metadata_controller.ex
  • apps/indexer/lib/indexer/fetcher/on_demand/nft_collection_metadata_refetch.ex
  • apps/block_scout_web/lib/block_scout_web/views/api/v2/address_view.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/holder_controller.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/inventory_controller.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/transfer_controller.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/api/v2/token_controller.ex
🧰 Additional context used
📓 Path-based instructions (2)
apps/explorer/lib/explorer/chain/token/instance.ex (1)

Pattern **/*.ex: same Elixir review guidelines as above.
apps/explorer/lib/explorer/chain.ex (1)

Pattern **/*.ex: same Elixir review guidelines as above.
⏰ Context from checks skipped due to timeout of 90000ms (4)
  • GitHub Check: Indexer Tests (default)
  • GitHub Check: Explorer Tests (default)
  • GitHub Check: Blockscout Web Tests (default)
  • GitHub Check: Dialyzer static analysis (default)
🔇 Additional comments (4)
apps/explorer/test/explorer/chain/token/instance_test.exs (2)

85-153: LGTM! Well-structured tests for upsert_token_instance/1.

The test cases cover essential scenarios:

  • Successful insertion with valid params
  • Validation failure with invalid params
  • Error handling without metadata
  • Error nullification with valid metadata

210-229: Add negative test cases for nft_instance_by_token_id_and_token_address/2.

The test suite should include cases for non-existent token IDs and invalid token addresses.

apps/explorer/lib/explorer/chain/token/instance.ex (1)

1030-1050: LGTM! Well-documented function for marking NFT collection for metadata refetch.

The implementation correctly handles:

  • Setting metadata to nil
  • Setting error status
  • Using a timeout for the database operation
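Roughly, such a batch update can look like the sketch below; the constant, the timeout value, and the surrounding module context are assumptions, not the PR's exact code:

# Sketch only; assumes it lives in Explorer.Chain.Token.Instance, where
# Ecto.Query is imported and @marked_to_refetch is defined.
def mark_nft_collection_to_refetch(token_contract_address_hash) do
  __MODULE__
  |> where([instance], instance.token_contract_address_hash == ^token_contract_address_hash)
  |> Repo.update_all(
    [set: [metadata: nil, error: @marked_to_refetch, updated_at: DateTime.utc_now()]],
    # generous timeout, since a large collection touches many rows
    timeout: :timer.minutes(10)
  )
end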
apps/block_scout_web/test/block_scout_web/controllers/api/v2/token_controller_test.exs (1)

1727-1735: LGTM! Good test coverage for unauthorized access.

The test properly verifies that the endpoint requires an API key for access.

Comment on lines +1055 to +1068
@spec stream_token_instances_with_unfetched_metadata(
        initial :: accumulator,
        reducer :: (entry :: map(), accumulator -> accumulator)
      ) :: {:ok, accumulator}
      when accumulator: term()
def stream_token_instances_with_unfetched_metadata(initial, reducer) when is_function(reducer, 2) do
  __MODULE__
  |> where([instance], is_nil(instance.error) and is_nil(instance.metadata))
  |> select([instance], %{
    contract_address_hash: instance.token_contract_address_hash,
    token_id: instance.token_id
  })
  |> Repo.stream_reduce(initial, reducer)
end

🛠️ Refactor suggestion

Add error handling for database operations.

The stream operation could fail due to database errors. Consider wrapping it in a try/rescue block.

 def stream_token_instances_with_unfetched_metadata(initial, reducer) when is_function(reducer, 2) do
+  try do
     __MODULE__
     |> where([instance], is_nil(instance.error) and is_nil(instance.metadata))
     |> select([instance], %{
       contract_address_hash: instance.token_contract_address_hash,
       token_id: instance.token_id
     })
     |> Repo.stream_reduce(initial, reducer)
+  rescue
+    e in Ecto.QueryError ->
+      {:error, e}
+  end
 end
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
 @spec stream_token_instances_with_unfetched_metadata(
         initial :: accumulator,
         reducer :: (entry :: map(), accumulator -> accumulator)
       ) :: {:ok, accumulator}
       when accumulator: term()
 def stream_token_instances_with_unfetched_metadata(initial, reducer) when is_function(reducer, 2) do
+  try do
     __MODULE__
     |> where([instance], is_nil(instance.error) and is_nil(instance.metadata))
     |> select([instance], %{
       contract_address_hash: instance.token_contract_address_hash,
       token_id: instance.token_id
     })
     |> Repo.stream_reduce(initial, reducer)
+  rescue
+    e in Ecto.QueryError ->
+      {:error, e}
+  end
 end

@vbaranov vbaranov requested a review from nikitosing February 4, 2025 16:30
GenServer.cast(__MODULE__, {:refetch, token})
end

defp fetch_metadata(token, _state) do

nitpick: let's remove state arg from this function?

Done in b4ed7b0

Comment on lines 3763 to 3768
def stream_cataloged_tokens(initial, reducer, some_time_ago_updated \\ 2880, limited? \\ false)
    when is_function(reducer, 2) do
  some_time_ago_updated
  |> Token.cataloged_tokens()
  |> add_fetcher_limit(limited?)
  |> order_by(asc: :updated_at)

Is this an artifact? We already have this function in token.ex:

def stream_cataloged_tokens(initial, reducer, some_time_ago_updated \\ 2880, limited? \\ false)


Processed in b4ed7b0

@@ -49,6 +49,12 @@ defmodule BlockScoutWeb.Routers.TokensApiV2Router do
pipe_through(:api_v2_no_forgery_protect)

patch("/:address_hash_param/instances/:token_id/refetch-metadata", V2.TokenController, :refetch_metadata)

post(
"/:address_hash_param/instances/trigger-nft-collection-metadata-refetch",

Let's make this endpoint consistent with the previous one: rename it to /:address_hash_param/instances/refetch-metadata and replace post with patch. Wdyt?


Initially, I intentionally changed the method and the path because, unlike the single-instance refetch-metadata endpoint, this endpoint only triggers the update process and does not directly set new data. However, from an API consumer's point of view, consistency in endpoint naming probably outweighs how precisely the name matches the action it performs. Renamed in 7a30e6c.
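For clarity, a hedged sketch of how the renamed route could sit next to the existing per-instance endpoint in TokensApiV2Router; the controller action name is an assumption, since the original post(...) call above is cut off before its action argument:

patch("/:address_hash_param/instances/:token_id/refetch-metadata", V2.TokenController, :refetch_metadata)

patch(
  "/:address_hash_param/instances/refetch-metadata",
  V2.TokenController,
  # the action name is illustrative; the PR may use a different one
  :trigger_nft_collection_metadata_refetch
)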

Comment on lines 1278 to 1287
@spec upsert_token_instance(map()) :: {:ok, __MODULE__.t()} | {:error, Ecto.Changeset.t()}
def upsert_token_instance(params) do
  changeset = __MODULE__.changeset(%__MODULE__{}, params)
  max_retries_count_before_ban = __MODULE__.error_to_max_retries_count_before_ban(params[:error])

  Repo.insert(changeset,
    on_conflict: token_instance_metadata_on_conflict(max_retries_count_before_ban),
    conflict_target: [:token_id, :token_contract_address_hash]
  )
end

Is this an artifact as well? This function was replaced by batch_upsert_token_instances in #11685.


Processed in b4ed7b0

@nikitosing nikitosing requested a review from a team February 6, 2025 06:33

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

♻️ Duplicate comments (3)
apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex (1)

28-36: ⚠️ Potential issue

Consider adding error handling for stream_token_instances_marked_to_refetch.

The init/3 function assumes the stream operation will always succeed. Consider handling potential database errors.

 @impl BufferedTask
 def init(initial_acc, reducer, _) do
-  {:ok, acc} =
-    Instance.stream_token_instances_marked_to_refetch(initial_acc, fn data, acc ->
-      reducer.(data, acc)
-    end)
+  case Instance.stream_token_instances_marked_to_refetch(initial_acc, fn data, acc ->
+    reducer.(data, acc)
+  end) do
+    {:ok, acc} -> acc
+    {:error, reason} ->
+      Logger.error("Failed to stream token instances: #{inspect(reason)}")
+      initial_acc
+  end
-  acc
 end
apps/explorer/lib/explorer/chain/token/instance.ex (2)

1241-1272: ⚠️ Potential issue

Add error handling for database operations.

The function should handle potential database errors when querying and fetching owner information.

Apply this diff to add proper error handling:

 def put_owner_to_token_instance(
       %__MODULE__{owner: nil, is_unique: true} = token_instance,
       %Token{type: type},
       options
     )
     when type in ["ERC-1155", "ERC-404"] do
-  owner_address_hash =
-    token_instance
-    |> __MODULE__.owner_query()
-    |> Chain.select_repo(options).one()
-
-  owner =
-    Address.get(
-      owner_address_hash,
-      options
-      |> Keyword.merge(
-        necessity_by_association: %{
-          :names => :optional,
-          :smart_contract => :optional,
-          Implementation.proxy_implementations_association() => :optional
-        }
-      )
-    )
-
-  %{token_instance | owner: owner, owner_address_hash: owner_address_hash}
+  with {:ok, owner_address_hash} <- 
+    token_instance
+    |> __MODULE__.owner_query()
+    |> Chain.select_repo(options).one()
+    |> case do
+      nil -> {:error, :owner_not_found}
+      address_hash -> {:ok, address_hash}
+    end,
+  {:ok, owner} <- 
+    Address.get(
+      owner_address_hash,
+      options
+      |> Keyword.merge(
+        necessity_by_association: %{
+          :names => :optional,
+          :smart_contract => :optional,
+          Implementation.proxy_implementations_association() => :optional
+        }
+      )
+    ) do
+    %{token_instance | owner: owner, owner_address_hash: owner_address_hash}
+  else
+    {:error, _reason} -> token_instance
+  end

1055-1068: ⚠️ Potential issue

Add error handling for database stream operations.

The stream operation could fail due to database errors. Consider adding proper error handling.

Apply this diff to add error handling:

 def stream_token_instances_with_unfetched_metadata(initial, reducer) when is_function(reducer, 2) do
+  try do
     __MODULE__
     |> where([instance], is_nil(instance.error) and is_nil(instance.metadata))
     |> select([instance], %{
       contract_address_hash: instance.token_contract_address_hash,
       token_id: instance.token_id
     })
     |> Repo.stream_reduce(initial, reducer)
+  rescue
+    e in Ecto.QueryError ->
+      {:error, e}
+  end
 end
🧹 Nitpick comments (3)
apps/explorer/test/explorer/chain/token/instance_test.exs (2)

155-208: Enhance pagination test coverage.

Consider adding these test cases:

  • Empty result set scenario
  • Multiple page sizes to verify pagination behavior
  • Edge case where token_id equals the page size

Example structure for empty result test:

test "returns empty list when no tokens exist" do
  token_contract_address = insert(:contract_address)
  token = insert(:token, contract_address: token_contract_address, type: "ERC-721")
  
  paging_options = %PagingOptions{page_size: 10}
  
  result =
    token_contract_address.hash
    |> Instance.address_to_unique_tokens(token, paging_options: paging_options)
    |> Enum.to_list()
    
  assert result == []
end

231-284: Add batch-specific test scenarios.

Consider adding these batch-specific test cases:

  • Mixed valid and invalid records in the same batch
  • Empty batch handling
  • Large batch size performance test

Example structure for mixed batch test:

test "handles mixed valid and invalid records in batch" do
  token = insert(:token)
  
  params = [
    %{
      token_id: 1,
      token_contract_address_hash: token.contract_address_hash,
      metadata: %{uri: "http://example.com"}
    },
    %{
      token_id: "invalid",
      token_contract_address_hash: token.contract_address_hash,
      metadata: %{uri: "http://example.com"}
    }
  ]
  
  results = Instance.batch_upsert_token_instances(params)
  assert length(results) == 1
  [result] = results
  assert result.token_id == Decimal.new(1)
end
apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex (1)

2-4: Enhance module documentation.

The current documentation is brief. Consider adding more details about:

  • When and why token instances are marked for re-fetching
  • The relationship with BufferedTask behavior
  • The permanent restart strategy and its implications
 @moduledoc """
-  Fetches information about a token instance, which is marked to be re-fetched.
+  Fetches information about token instances that are marked for re-fetching.
+
+  This module implements the `BufferedTask` behavior to process token instances
+  in batches. It uses a permanent restart strategy to ensure continuous operation
+  even after failures.
+
+  Token instances are marked for re-fetching when their metadata needs to be
+  updated or when previous fetch attempts failed.
 """
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between a47e890 and 6764de3.

📒 Files selected for processing (6)
  • apps/explorer/lib/explorer/chain.ex (0 hunks)
  • apps/explorer/lib/explorer/chain/token/instance.ex (18 hunks)
  • apps/explorer/test/explorer/chain/token/instance_test.exs (2 hunks)
  • apps/indexer/lib/indexer/application.ex (3 hunks)
  • apps/indexer/lib/indexer/fetcher/on_demand/nft_collection_metadata_refetch.ex (1 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex (1 hunks)
💤 Files with no reviewable changes (1)
  • apps/explorer/lib/explorer/chain.ex
🚧 Files skipped from review as they are similar to previous changes (1)
  • apps/indexer/lib/indexer/fetcher/on_demand/nft_collection_metadata_refetch.ex
🧰 Additional context used
📓 Path-based instructions (1)
`**/*.ex`: This is a module written in Elixir. For changed f...

**/*.ex: This is a module written in Elixir. For changed functions here are general guidelines for reviewing:

  1. Context Awareness:
  • Before raising a comment about missing specifications (@spec) or documentation (@doc), verify whether they exist in the corresponding code.
  • Check if a single @spec or @doc covers multiple clauses of a function (e.g., functions with different argument patterns). Do not raise a comment if one shared @spec or @doc exists.
  2. Behavior-Implementing Functions:
  • If a function is preceded by @impl, confirm that it implements a behavior. Do not raise comments for missing @spec or @doc if:
    • @spec is omitted (this is acceptable).
    • Documentation is omitted or provided in #-style comments.
  3. Predefined Functions:
  • For GenServer functions like child_spec/1 and start_link/1, do not raise comments for missing @spec or @doc.
  4. Documentation Format:
  • When reviewing @doc comments, ensure that it follows the following format:
@doc """
  This function does X, Y, and Z and this very first sentence of the comment must not be wrapped if the line exceeds 80 chars.

  The next lines in the documentation comment of any function must be wrapped
  after 80 chars (soft limit). It will allow to read documentation comments
  well in the code editors.

  ## Parameters
  - `param1`: Description of param1
  - `param2`: Description of param2

  ## Returns
  - what the function returns in one case
  - what the function returns if there is more than one possible outcome
"""
  5. Exceptions for Private Functions:
  • For private functions:
    • Skip comments about missing @spec if the function has no arguments.
    • Ensure no duplicate @spec is required for multiple clauses.

Example Scenarios to Reduce False Positives

  1. Single Specification and Documentation for Multiple Clauses (i.e., different implementations based on pattern matching of arguments):
@doc """
  Provides an example function.

  ## Parameters
  - `integer`: An integer input.

  ## Returns
  - A string result.
"""
@spec example_function(integer()) :: String.t()
def example_function(1), do: "One"
def example_function(2), do: "Two"
  • Do not raise a comment here, as both @spec and @doc cover all clauses.
  2. Behavior-Implementing Function:
# Handles an example call.
@impl GenServer
def handle_call(:example, _from, state) do
  {:reply, :ok, state}
end
  • Do not raise a comment for missing @spec.
  • Ensure the documentation is valid as a # comment.
  3. Function with No Arguments:
defp no_arg_function, do: :ok
  • Do not raise a comment for a missing @spec.
  • apps/indexer/lib/indexer/application.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex
  • apps/explorer/lib/explorer/chain/token/instance.ex
⏰ Context from checks skipped due to timeout of 90000ms (3)
  • GitHub Check: Indexer Tests (default)
  • GitHub Check: Explorer Tests (default)
  • GitHub Check: Blockscout Web Tests (default)
🔇 Additional comments (9)
apps/explorer/test/explorer/chain/token/instance_test.exs (2)

85-153: Add edge cases to upsert_token_instance tests.

While the current test coverage is good, consider adding these edge cases:

  • Invalid token_id formats (non-numeric, negative)
  • Non-existent token contract address
  • Malformed metadata structure

Example structure for additional test:

test "fails with invalid token_id format" do
  token = insert(:token)
  
  params = %{
    token_id: "invalid",
    token_contract_address_hash: token.contract_address_hash,
    metadata: %{uri: "http://example.com"}
  }
  
  {:error, changeset} = Instance.upsert_token_instance(params)
  assert {"is invalid", _} = changeset.errors[:token_id]
end

210-229: Add error case tests for NFT instance retrieval.

The current test only covers the successful case. Add tests for error scenarios:

  • Non-existent token_id
  • Invalid token contract address
  • Invalid token_id format
apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex (2)

19-26: LGTM!

The child specification is well-implemented, correctly merging default options with initialization options.


46-53: LGTM!

The defaults function is well-implemented with appropriate configuration options and fallback values.

apps/indexer/lib/indexer/application.ex (3)

11-11: LGTM!

The new aliases are well-organized and follow consistent naming conventions.

Also applies to: 14-14


37-40: LGTM!

The pool size calculation for the new token instance refetch follows the established pattern consistently.


52-52: LGTM!

The new supervisors are properly configured and added in a logical order to the supervision tree.

Also applies to: 54-54

apps/explorer/lib/explorer/chain/token/instance.ex (2)

11-22: LGTM! Well-structured module attributes and types.

The new module attributes and types are well-defined with clear purposes:

  • Default paging options for consistent pagination
  • API type for function specifications
  • Constant for marking tokens for refetch

1278-1287: LGTM! Well-implemented batch upsert function.

The function properly handles:

  • Changeset validation
  • Error handling
  • Conflict resolution with appropriate retry logic

Comment on lines +38 to +44
@impl BufferedTask
def run(token_instances, _) when is_list(token_instances) do
  token_instances
  |> batch_fetch_instances()

  :ok
end

⚠️ Potential issue

Add error handling for batch_fetch_instances.

The function should handle potential errors from batch_fetch_instances/1 and propagate them appropriately.

 @impl BufferedTask
 def run(token_instances, _) when is_list(token_instances) do
-  token_instances
-  |> batch_fetch_instances()
-
-  :ok
+  case batch_fetch_instances(token_instances) do
+    :ok -> :ok
+    {:error, reason} ->
+      Logger.error("Failed to fetch token instances: #{inspect(reason)}")
+      {:error, reason}
+  end
 end

Comment on lines +1073 to +1086
@spec stream_token_instances_marked_to_refetch(
        initial :: accumulator,
        reducer :: (entry :: map(), accumulator -> accumulator)
      ) :: {:ok, accumulator}
      when accumulator: term()
def stream_token_instances_marked_to_refetch(initial, reducer) when is_function(reducer, 2) do
  __MODULE__
  |> where([instance], instance.error == ^@marked_to_refetch and is_nil(instance.metadata))
  |> select([instance], %{
    contract_address_hash: instance.token_contract_address_hash,
    token_id: instance.token_id
  })
  |> Repo.stream_reduce(initial, reducer)
end

⚠️ Potential issue

Add error handling for database stream operations.

Similar to stream_token_instances_with_unfetched_metadata, this function should handle potential database errors.

Apply this diff to add error handling:

 def stream_token_instances_marked_to_refetch(initial, reducer) when is_function(reducer, 2) do
+  try do
     __MODULE__
     |> where([instance], instance.error == ^@marked_to_refetch and is_nil(instance.metadata))
     |> select([instance], %{
       contract_address_hash: instance.token_contract_address_hash,
       token_id: instance.token_id
     })
     |> Repo.stream_reduce(initial, reducer)
+  rescue
+    e in Ecto.QueryError ->
+      {:error, e}
+  end
 end

@vbaranov vbaranov force-pushed the vb-nft-collection-trigger-metadata-refetch-admin-api-endpoint branch from 6764de3 to e69ee8a on February 6, 2025 09:28

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 0

♻️ Duplicate comments (1)
apps/block_scout_web/test/block_scout_web/controllers/api/v2/token_controller_test.exs (1)

1752-1752: 🛠️ Refactor suggestion

Replace sleep with proper test synchronization.

The use of :timer.sleep(100) can make tests flaky. Consider using ExUnit's assert_receive or a similar mechanism to properly wait for the async operation to complete.

-:timer.sleep(100)
+assert_receive {:chain_event, :fetched_token_instance_metadata, :on_demand, _}, 1000
🧹 Nitpick comments (4)
config/runtime.exs (1)

906-909: LGTM! Consider documenting the environment variables.

The configuration for the token instance refetch module is well-structured with appropriate defaults for concurrency and batch size.

Consider adding these environment variables to the documentation to help users understand how to tune the refetch performance:

  • INDEXER_TOKEN_INSTANCE_REFETCH_CONCURRENCY
  • INDEXER_TOKEN_INSTANCE_REFETCH_BATCH_SIZE
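A minimal sketch of how those two variables might be wired up, assuming the module name Indexer.Fetcher.TokenInstance.Refetch inferred from the file path apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex; the fallback values of "10" are placeholders, not necessarily the defaults shipped in this PR:

# in config/runtime.exs (the file already starts with `import Config`)
config :indexer, Indexer.Fetcher.TokenInstance.Refetch,
  concurrency: String.to_integer(System.get_env("INDEXER_TOKEN_INSTANCE_REFETCH_CONCURRENCY", "10")),
  batch_size: String.to_integer(System.get_env("INDEXER_TOKEN_INSTANCE_REFETCH_BATCH_SIZE", "10"))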
apps/block_scout_web/test/block_scout_web/controllers/api/v2/token_controller_test.exs (1)

1710-1728: Consider adding more test cases for API key validation.

The test setup is good but could be enhanced by adding test cases for:

  1. Invalid API key
  2. Malformed API key

Example test case:

test "returns 401 with invalid API key", %{conn: conn} do
  token = insert(:token, type: "ERC-721")
  
  request = post(
    conn,
    "/api/v2/tokens/#{token.contract_address.hash}/instances/trigger-nft-collection-metadata-refetch",
    %{"api_key" => "invalid_key"}
  )
  
  assert %{"message" => "Wrong API key"} = json_response(request, 401)
end
apps/explorer/test/explorer/chain/token/instance_test.exs (1)

85-138: Add test cases for edge cases in token pagination.

The current test verifies basic pagination functionality, but consider adding test cases for:

  • Empty result set
  • Single token instance
  • Maximum page size
  • Invalid token_id in paging options
apps/explorer/lib/explorer/chain/token/instance.ex (1)

1030-1050: Add error handling for database timeouts.

The function could timeout when processing large NFT collections. Consider:

  1. Adding a try/rescue block to handle Ecto.QueryError
  2. Implementing batch processing for large collections
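A hedged sketch of the batching idea, assuming the @marked_to_refetch and @timeout attributes of the surrounding Instance module; the helper name, the batch size, and the select-then-update loop are illustrative, not a drop-in replacement for the PR's implementation:

@refetch_batch_size 10_000

defp mark_nft_collection_to_refetch_in_batches(token_contract_address_hash, now) do
  # pick a limited slice of not-yet-marked instances for this collection
  token_ids =
    __MODULE__
    |> where([i], i.token_contract_address_hash == ^token_contract_address_hash)
    |> where([i], is_nil(i.error) or i.error != ^@marked_to_refetch)
    |> select([i], i.token_id)
    |> limit(^@refetch_batch_size)
    |> Repo.all(timeout: @timeout)

  case token_ids do
    [] ->
      :ok

    ids ->
      # mark only this slice, then recurse until nothing is left
      Repo.update_all(
        from(i in __MODULE__,
          where: i.token_contract_address_hash == ^token_contract_address_hash,
          where: i.token_id in ^ids
        ),
        [set: [metadata: nil, error: @marked_to_refetch, updated_at: now]],
        timeout: @timeout
      )

      mark_nft_collection_to_refetch_in_batches(token_contract_address_hash, now)
  end
end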
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 6764de3 and e69ee8a.

📒 Files selected for processing (22)
  • apps/block_scout_web/lib/block_scout_web/controllers/api/v2/token_controller.ex (5 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/holder_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/metadata_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/transfer_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/inventory_controller.ex (2 hunks)
  • apps/block_scout_web/lib/block_scout_web/routers/tokens_api_v2_router.ex (1 hunks)
  • apps/block_scout_web/lib/block_scout_web/views/api/v2/address_view.ex (1 hunks)
  • apps/block_scout_web/test/block_scout_web/controllers/api/v2/token_controller_test.exs (2 hunks)
  • apps/explorer/lib/explorer/chain.ex (0 hunks)
  • apps/explorer/lib/explorer/chain/token/instance.ex (18 hunks)
  • apps/explorer/test/explorer/chain/token/instance_test.exs (2 hunks)
  • apps/explorer/test/explorer/chain_test.exs (0 hunks)
  • apps/indexer/lib/indexer/application.ex (3 hunks)
  • apps/indexer/lib/indexer/fetcher/on_demand/nft_collection_metadata_refetch.ex (1 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/helper.ex (2 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/realtime.ex (1 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex (1 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/retry.ex (2 hunks)
  • apps/indexer/lib/indexer/fetcher/token_instance/sanitize.ex (3 hunks)
  • apps/indexer/test/indexer/fetcher/token_instance/refetch_test.exs (1 hunks)
  • config/runtime.exs (2 hunks)
  • docker-compose/envs/common-blockscout.env (2 hunks)
💤 Files with no reviewable changes (2)
  • apps/explorer/test/explorer/chain_test.exs
  • apps/explorer/lib/explorer/chain.ex
🚧 Files skipped from review as they are similar to previous changes (16)
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/inventory_controller.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/metadata_controller.ex
  • apps/block_scout_web/lib/block_scout_web/routers/tokens_api_v2_router.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/retry.ex
  • docker-compose/envs/common-blockscout.env
  • apps/indexer/lib/indexer/fetcher/token_instance/realtime.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/helper.ex
  • apps/indexer/lib/indexer/fetcher/on_demand/nft_collection_metadata_refetch.ex
  • apps/block_scout_web/lib/block_scout_web/views/api/v2/address_view.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/transfer_controller.ex
  • apps/indexer/test/indexer/fetcher/token_instance/refetch_test.exs
  • apps/block_scout_web/lib/block_scout_web/controllers/tokens/instance/holder_controller.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/sanitize.ex
  • apps/indexer/lib/indexer/application.ex
  • apps/indexer/lib/indexer/fetcher/token_instance/refetch.ex
  • apps/block_scout_web/lib/block_scout_web/controllers/api/v2/token_controller.ex
🧰 Additional context used
📓 Path-based instructions (1)
`**/*.ex`: This is a module written in Elixir. For changed f...

  • apps/explorer/lib/explorer/chain/token/instance.ex
🔇 Additional comments (14)
config/runtime.exs (1)

855-857: LGTM! Supervisor configuration looks good.

The configuration for the token instance refetch supervisor follows the established pattern and correctly uses environment variables for control.

apps/block_scout_web/test/block_scout_web/controllers/api/v2/token_controller_test.exs (2)

1730-1762: LGTM! Comprehensive test coverage for the success case.

The test thoroughly verifies:

  1. The endpoint returns 200 OK
  2. All token instances are marked for refetch
  3. The metadata is cleared and error is set correctly

1764-1772: LGTM! Good security test coverage.

The test properly verifies that the endpoint requires authentication by checking for a 401 response when no API key is provided.

apps/explorer/test/explorer/chain/token/instance_test.exs (2)

140-159: Add error case tests for NFT instance retrieval.

The current test only covers the successful retrieval case. Consider adding tests for:

  • Non-existent token_id
  • Invalid token contract address
  • Invalid token_id format (non-numeric)

161-214: LGTM!

The test cases thoroughly cover the batch upsert functionality, including error handling and metadata updates.

apps/explorer/lib/explorer/chain/token/instance.ex (9)

1052-1068: Add error handling for database operations.

The stream operation could fail due to database errors. Consider wrapping it in a try/rescue block.


1070-1086: Add error handling for database stream operations.

Similar to stream_token_instances_with_unfetched_metadata, this function should handle potential database errors.


1088-1111: LGTM!

The function is well-documented and correctly implements the metadata check functionality.


1113-1168: LGTM!

The function implements a sophisticated error handling system with prioritization and is well-documented.


1171-1204: LGTM!

The function correctly implements pagination and association preloading with clear documentation.


1206-1233: LGTM!

The function correctly implements NFT instance retrieval with proper error handling.


1235-1272: Add error handling for database queries.

The owner query operations could fail due to database errors. Consider adding proper error handling.


1274-1287: LGTM!

The function correctly implements batch upsert with proper conflict handling.


1289-1341: LGTM!

The function implements a robust retry mechanism with exponential backoff and proper banning logic.
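Since the retry and banning mechanics are only summarized here, a self-contained sketch of the exponential-backoff idea may help; the base interval, multiplier, and retry cut-off below are illustrative constants, not the values used in the module:

defmodule BackoffSketch do
  @base_interval_ms :timer.minutes(10)
  @multiplier 2
  @max_retries_before_ban 10

  @doc """
    Returns either a ban marker or the next refetch time for a token instance.
  """
  @spec next_refetch(non_neg_integer(), DateTime.t()) :: :ban | {:retry, DateTime.t()}
  def next_refetch(retries_count, now \\ DateTime.utc_now()) do
    if retries_count >= @max_retries_before_ban do
      :ban
    else
      # the delay doubles on every failed attempt: 10, 20, 40, ... minutes
      delay_ms = @base_interval_ms * Integer.pow(@multiplier, retries_count)
      {:retry, DateTime.add(now, delay_ms, :millisecond)}
    end
  end
end

# BackoffSketch.next_refetch(0) #=> {:retry, ~10 minutes from now}
# BackoffSketch.next_refetch(3) #=> {:retry, ~80 minutes from now}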

@vbaranov vbaranov force-pushed the vb-nft-collection-trigger-metadata-refetch-admin-api-endpoint branch from e69ee8a to b4ed7b0 on February 6, 2025 10:42

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

🧹 Nitpick comments (3)
apps/explorer/test/explorer/chain/token/instance_test.exs (2)

85-138: Add test cases for edge scenarios.

The test covers the basic pagination case, but consider adding tests for:

  • Empty result set
  • Invalid token_ids
  • Maximum page size
  • Boundary conditions for token_id values
 describe "address_to_unique_tokens/2" do
   test "unique tokens can be paginated through token_id" do
     # existing test...
   end
+
+  test "returns empty list when no tokens exist" do
+    token_contract_address = insert(:contract_address)
+    token = insert(:token, contract_address: token_contract_address, type: "ERC-721")
+    
+    paging_options = %PagingOptions{page_size: 1}
+    
+    result =
+      token_contract_address.hash
+      |> Instance.address_to_unique_tokens(token, paging_options: paging_options)
+    
+    assert result == []
+  end
+
+  test "handles maximum token_id values" do
+    token_contract_address = insert(:contract_address)
+    token = insert(:token, contract_address: token_contract_address, type: "ERC-721")
+    
+    insert(
+      :token_instance,
+      token_contract_address_hash: token_contract_address.hash,
+      token_id: Decimal.new("115792089237316195423570985008687907853269984665640564039457584007913129639935")
+    )
+    
+    paging_options = %PagingOptions{page_size: 1}
+    
+    result =
+      token_contract_address.hash
+      |> Instance.address_to_unique_tokens(token, paging_options: paging_options)
+      |> Enum.map(& &1.token_id)
+    
+    assert length(result) == 1
+  end
 end

161-214: Add concurrent upsert test.

Consider adding a test case for concurrent batch upserts to ensure race conditions are handled correctly.

 describe "batch_upsert_token_instances/1" do
   # existing tests...
+
+  test "handles concurrent upserts correctly" do
+    token = insert(:token)
+    
+    params = %{
+      token_id: 1,
+      token_contract_address_hash: token.contract_address_hash,
+      metadata: %{uri: "http://example.com"}
+    }
+    
+    tasks = for _ <- 1..5 do
+      Task.async(fn ->
+        Instance.batch_upsert_token_instances([params])
+      end)
+    end
+    
+    results = Task.await_many(tasks)
+    
+    assert length(Enum.uniq_by(List.flatten(results), & &1.token_id)) == 1
+  end
 end
apps/explorer/lib/explorer/chain/token/instance.ex (1)

1030-1050: Add transaction wrapper for atomicity.

Consider wrapping the update operation in a transaction to ensure atomicity.

 def mark_nft_collection_to_refetch(token_contract_address_hash) do
   now = DateTime.utc_now()
 
-  Repo.update_all(
-    from(instance in __MODULE__,
-      where: instance.token_contract_address_hash == ^token_contract_address_hash
-    ),
-    [set: [metadata: nil, error: @marked_to_refetch, updated_at: now]],
-    timeout: @timeout
-  )
+  Repo.transaction(fn ->
+    Repo.update_all(
+      from(instance in __MODULE__,
+        where: instance.token_contract_address_hash == ^token_contract_address_hash
+      ),
+      [set: [metadata: nil, error: @marked_to_refetch, updated_at: now]],
+      timeout: @timeout
+    )
+  end)
 end
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between e69ee8a and b4ed7b0.

📒 Files selected for processing (4)
  • apps/explorer/lib/explorer/chain.ex (0 hunks)
  • apps/explorer/lib/explorer/chain/token/instance.ex (18 hunks)
  • apps/explorer/test/explorer/chain/token/instance_test.exs (2 hunks)
  • apps/indexer/lib/indexer/fetcher/on_demand/nft_collection_metadata_refetch.ex (1 hunks)
💤 Files with no reviewable changes (1)
  • apps/explorer/lib/explorer/chain.ex
🚧 Files skipped from review as they are similar to previous changes (1)
  • apps/indexer/lib/indexer/fetcher/on_demand/nft_collection_metadata_refetch.ex
🧰 Additional context used
📓 Path-based instructions (1)
`**/*.ex`: This is a module written in Elixir. For changed f...

  • apps/explorer/lib/explorer/chain/token/instance.ex
⏰ Context from checks skipped due to timeout of 90000ms (5)
  • GitHub Check: Indexer Tests (default)
  • GitHub Check: Explorer Tests (default)
  • GitHub Check: Blockscout Web Tests (default)
  • GitHub Check: Dialyzer static analysis (default)
  • GitHub Check: Analyze (javascript)
🔇 Additional comments (3)
apps/explorer/test/explorer/chain/token/instance_test.exs (1)

140-159: Add error case tests for NFT instance retrieval.

The test only covers the successful retrieval case.

apps/explorer/lib/explorer/chain/token/instance.ex (2)

15-22: LGTM! Well-defined constants and types.

The constants and types are clearly defined with appropriate values.


1055-1168: Add error handling for database stream operations.

The stream operations could fail due to database errors.

Comment on lines +1193 to +1204
@spec address_to_unique_tokens(Hash.Address.t(), Token.t(), [paging_options | api?]) :: [__MODULE__.t()]
def address_to_unique_tokens(contract_address_hash, token, options \\ []) do
  paging_options = Keyword.get(options, :paging_options, @default_paging_options)

  contract_address_hash
  |> __MODULE__.address_to_unique_token_instances_query()
  |> __MODULE__.page_token_instance(paging_options)
  |> limit(^paging_options.page_size)
  |> preload([_], owner: [:names, :smart_contract, ^Implementation.proxy_implementations_association()])
  |> Chain.select_repo(options).all()
  |> Enum.map(&put_owner_to_token_instance(&1, token, options))
end

🛠️ Refactor suggestion

Add error handling for database queries.

The function should handle potential database errors and invalid inputs.

 def address_to_unique_tokens(contract_address_hash, token, options \\ []) do
   paging_options = Keyword.get(options, :paging_options, @default_paging_options)
 
-  contract_address_hash
-  |> __MODULE__.address_to_unique_token_instances_query()
-  |> __MODULE__.page_token_instance(paging_options)
-  |> limit(^paging_options.page_size)
-  |> preload([_], owner: [:names, :smart_contract, ^Implementation.proxy_implementations_association()])
-  |> Chain.select_repo(options).all()
-  |> Enum.map(&put_owner_to_token_instance(&1, token, options))
+  try do
+    contract_address_hash
+    |> __MODULE__.address_to_unique_token_instances_query()
+    |> __MODULE__.page_token_instance(paging_options)
+    |> limit(^paging_options.page_size)
+    |> preload([_], owner: [:names, :smart_contract, ^Implementation.proxy_implementations_association()])
+    |> Chain.select_repo(options).all()
+    |> Enum.map(&put_owner_to_token_instance(&1, token, options))
+  rescue
+    e in Ecto.QueryError ->
+      {:error, e}
+    _ ->
+      {:error, :unknown_error}
+  end
 end

Comment on lines +1278 to +1311
@spec batch_upsert_token_instances([map()]) :: [__MODULE__.t()]
def batch_upsert_token_instances(params_list) do
  params_to_insert = __MODULE__.adjust_insert_params(params_list)

  {_, result} =
    Repo.insert_all(__MODULE__, params_to_insert,
      on_conflict: token_instance_metadata_on_conflict(),
      conflict_target: [:token_id, :token_contract_address_hash],
      returning: true
    )

  result
end

defp token_instance_metadata_on_conflict do
  from(
    token_instance in __MODULE__,
    update: [
      set: [
        metadata: fragment("EXCLUDED.metadata"),
        error: fragment("EXCLUDED.error"),
        owner_updated_at_block: token_instance.owner_updated_at_block,
        owner_updated_at_log_index: token_instance.owner_updated_at_log_index,
        owner_address_hash: token_instance.owner_address_hash,
        inserted_at: fragment("LEAST(?, EXCLUDED.inserted_at)", token_instance.inserted_at),
        updated_at: fragment("GREATEST(?, EXCLUDED.updated_at)", token_instance.updated_at),
        retries_count: token_instance.retries_count + 1,
        refetch_after: fragment("EXCLUDED.refetch_after"),
        is_banned: fragment("EXCLUDED.is_banned")
      ]
    ],
    where: is_nil(token_instance.metadata)
  )
end

🛠️ Refactor suggestion

Add input validation for batch upsert params.

Consider validating the input params before processing to ensure data integrity.

 def batch_upsert_token_instances(params_list) do
+  unless is_list(params_list) do
+    raise ArgumentError, "params_list must be a list"
+  end
+
+  Enum.each(params_list, fn params ->
+    unless is_map(params) do
+      raise ArgumentError, "each item in params_list must be a map"
+    end
+
+    required_keys = [:token_id, :token_contract_address_hash]
+    missing_keys = Enum.reject(required_keys, &Map.has_key?(params, &1))
+
+    unless Enum.empty?(missing_keys) do
+      raise ArgumentError, "missing required keys: #{inspect(missing_keys)}"
+    end
+  end)
+
   params_to_insert = __MODULE__.adjust_insert_params(params_list)
 
   {_, result} =

@vbaranov vbaranov requested a review from nikitosing February 6, 2025 11:04
Successfully merging this pull request may close these issues.

NFTs: refresh collection handle