
fix(ingest/gc): add limit, add actual loop for iterating over batches #11809

Merged — 4 commits into master, Nov 6, 2024

Conversation


@anshbansal (Collaborator) commented on Nov 6, 2024

  • There was no actual loop iterating over batches; this adds one.
  • References were not being deleted; this adds that.
  • Added a limit on the number of entities deleted, so a single run doesn't go on forever and risk multiple runs starting concurrently.
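The bounded batch loop described above can be sketched as follows. This is a simplified illustration of the approach, not the PR's actual code; fetch_next_batch and delete_one are hypothetical stand-ins for DataHub's graph-client calls:

```python
from itertools import islice
from typing import Callable, List


def cleanup_in_batches(
    fetch_next_batch: Callable[[], List[str]],
    delete_one: Callable[[str], None],
    limit_entities_delete: int = 10000,
) -> int:
    """Loop over batches until none remain or the delete limit is hit."""
    removed = 0
    while removed < limit_entities_delete:
        batch = fetch_next_batch()
        if not batch:
            break  # no soft-deleted entities left
        for urn in batch:
            if removed >= limit_entities_delete:
                break  # hard cap: a single run never exceeds the limit
            delete_one(urn)  # delete references first, then the entity itself
            removed += 1
    return removed
```

With a source that never runs dry and a limit of 10000, the run stops after 10000 deletions instead of continuing indefinitely.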

Checklist

  • The PR conforms to DataHub's Contributing Guideline (particularly Commit Message Format)
  • Links to related issues (if applicable)
  • Tests for the changes have been added/updated (if applicable)
  • Docs related to the changes have been added/updated (if applicable). If a new feature has been added a Usage Guide has been added for the same.
  • For any breaking change/potential downtime/deprecation/big changes an entry has been made in Updating DataHub

@github-actions bot added the "ingestion" label (PR or Issue related to the ingestion of metadata) on Nov 6, 2024

@@ -59,6 +59,9 @@ class SoftDeletedEntitiesCleanupConfig(ConfigModel):
         default=None,
         description="Query to filter entities",
     )
     limit_entities_delete: int = Field(
Contributor:

Can we make this optional?

Contributor:

I

             self.report.num_soft_deleted_entity_removed
             <= self.config.limit_entities_delete
         ):
             urns = list(
Contributor:

You get more items here than the limit. I think it would be better to iterate over get_urns_by_filter, stop when you hit the limit, and keep only the limit number of items in memory and in the list.
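That suggestion — streaming lazily and stopping exactly at the limit, so no more than the limit's worth of urns is ever materialized — could be sketched like this. This is a sketch, not the merged code; urn_stream stands in for the real get_urns_by_filter call:

```python
from itertools import islice
from typing import Callable, Iterable


def delete_up_to_limit(
    urn_stream: Iterable[str],
    delete_entity: Callable[[str], None],
    limit: int,
) -> int:
    """Consume the stream lazily, deleting at most `limit` entities."""
    removed = 0
    for urn in islice(urn_stream, limit):  # never pulls more than `limit` urns
        delete_entity(urn)
        removed += 1
    return removed
```

Because islice stops pulling from the generator once the limit is reached, memory usage stays bounded regardless of how many soft-deleted entities exist.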

Collaborator:

Do you mean in the case where batch_size > limit?

Collaborator:

What if I do this:

         while (
-            self.report.num_soft_deleted_entity_removed
-            <= self.config.limit_entities_delete
+            self.config.limit_entities_delete <= 0
+            or self.report.num_soft_deleted_entity_removed
+            <= max(self.config.limit_entities_delete, self.config.batch_size)
         ):
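The proposed condition treats a non-positive limit as "no limit" and otherwise keeps looping until the removal count passes the larger of the limit and the batch size. Extracted as a standalone predicate, the behavior looks like this (a sketch mirroring the diff above, not the merged code):

```python
def should_continue(num_removed: int, limit_entities_delete: int, batch_size: int) -> bool:
    """Mirror of the proposed while-condition from the diff."""
    return (
        limit_entities_delete <= 0  # non-positive limit disables the cap entirely
        or num_removed <= max(limit_entities_delete, batch_size)
    )
```

Taking the max means that even when batch_size exceeds the configured limit, at least one full batch can still be processed, which addresses the batch_size > limit case raised earlier in the thread.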

         10000, description="Max number of entities to delete."
     )

     runtime_limit_seconds: Optional[int] = Field(
Collaborator:

I think this makes more sense as minutes.

Collaborator:

Probably true in reality, but dev-ops is pretty used to dealing with seconds for lots of things. Feel free to change it later, but it's not a blocker imho.

@david-leifker david-leifker merged commit 32878ab into master Nov 6, 2024
71 of 72 checks passed
@david-leifker david-leifker deleted the ab-add-limit-gc-source branch November 6, 2024 23:08