
Handle patch error messages with SDK #193

Merged — 153 commits, Jul 20, 2023
Changes shown from 142 commits

Commits
0b5dc74
started on integration testing, but needs more work
nh916 Jun 14, 2023
dc22415
cript.API.search removed typing for node_type for now
nh916 Jun 16, 2023
3b7dc08
test_material.py wrote integration test, but currently has issues pas…
nh916 Jun 16, 2023
6502936
adding a
nh916 Jun 16, 2023
cc313e1
posting to DB and getting it works, but deserialization doesn't
nh916 Jun 16, 2023
1adc8f7
posting to DB and getting it works, but deserialization doesn't
nh916 Jun 16, 2023
bd712bf
removed unneeded name changes
nh916 Jun 16, 2023
edc6b6a
wrote integration test for Project node
nh916 Jun 16, 2023
6ab3a8d
wrote integration test for collection node
nh916 Jun 16, 2023
ffebf9c
wrote integration test for experiment node
nh916 Jun 16, 2023
c2a02b0
wrote integration test for inventory node
nh916 Jun 16, 2023
66df277
massively cleaned up project integration test function
nh916 Jun 16, 2023
e7467b7
massively cleaned up collection integration test function
nh916 Jun 16, 2023
03c9215
removed unneeded comment
nh916 Jun 16, 2023
59e9952
added docstring to project integration test
nh916 Jun 16, 2023
c0b4a8f
added integration test for inventory
nh916 Jun 16, 2023
727054c
renaming project and collection node names for integration tests
nh916 Jun 16, 2023
4d23f0c
refactoring `test_material/test_integration_material()`
nh916 Jun 17, 2023
9c30b32
wrote integration test for simple process node
nh916 Jun 17, 2023
ab5bc37
created `complex_process_node` fixture
nh916 Jun 17, 2023
a7d363d
added `complex_process_node` fixture
nh916 Jun 17, 2023
d651298
wrote integration test for process
nh916 Jun 17, 2023
92bf798
wrote integration test for data node
nh916 Jun 17, 2023
8ddbce8
wrote integration test for computation node
nh916 Jun 17, 2023
3148f63
renaming project name for integration test
nh916 Jun 17, 2023
aa1af78
started on integration test for computation_process
nh916 Jun 17, 2023
489c635
worked on `test_integration_reference`
nh916 Jun 17, 2023
b6bd6a8
worked on `test_integration_condition`
nh916 Jun 17, 2023
439959d
wrote `test_integration_file`
nh916 Jun 17, 2023
7ab3d6e
make user email, orcid optional
InnocentBug Jun 20, 2023
2248505
deserializing within integration test to node
nh916 Jun 20, 2023
7d85588
checking node json vs api json to node to json
nh916 Jun 20, 2023
42f4712
patch invalid uids out
InnocentBug Jun 20, 2023
a412f00
Merge branch 'wip_integration_tests' of github.com:C-Accel-CRIPT/Pyth…
InnocentBug Jun 20, 2023
3b3c48a
Merge branch 'wip_integration_tests' of github.com:C-Accel-CRIPT/Pyth…
InnocentBug Jun 20, 2023
c5fdd49
made experiment integration test function DRY
nh916 Jun 20, 2023
4f6d14d
fixing `complex_process_node` fixture
nh916 Jun 20, 2023
c357d2d
fixed type hinting for user getters
nh916 Jun 20, 2023
92e5ea9
updating `complex_property_node`
nh916 Jun 20, 2023
b10cb4d
wrote `test_integration_material_property` but getting `CRIPTOrphaned…
nh916 Jun 20, 2023
be25ddc
wrote `test_integration_process_condition` but getting `CRIPTOrphaned…
nh916 Jun 20, 2023
5f69172
wrote `test_integration_material_ingredient` but getting `CRIPTOrphan…
nh916 Jun 20, 2023
a70fc6a
wrote `test_integration_quantity` but getting `CRIPTOrphanedProcessEr…
nh916 Jun 20, 2023
08dc6d8
updated `complex_equipment_node`
nh916 Jun 20, 2023
4798d33
wrote `test_integration_process_equipment` but it is getting `CRIPTNo…
nh916 Jun 20, 2023
275bc3c
formatted test files and fixture with black
nh916 Jun 20, 2023
26dc580
updated `complex_computational_forcefield_node` fixture
nh916 Jun 20, 2023
4865a48
wrote `test_integration_material_computational_forcefield` but gettin…
nh916 Jun 20, 2023
6200cbf
updated `complex_software_configuration_node` fixture
nh916 Jun 20, 2023
9462b76
commented out assertion in `integrate_nodes_helper`
nh916 Jun 20, 2023
99354e5
`test_integration_software_configuration` written correctly
nh916 Jun 20, 2023
90864ab
updated project name for `test_integration_software_configuration`
nh916 Jun 20, 2023
3f90db9
wrote `test_integration_algorithm` and working correctly right now
nh916 Jun 20, 2023
6a2af46
* updated `complex_parameter_node` fixture
nh916 Jun 20, 2023
e6ec469
* updated `complex_algorithm_node` fixture
nh916 Jun 20, 2023
b827368
wrote `test_integration_algorithm`
nh916 Jun 20, 2023
50dd1d8
wrote `test_integration_parameter`
nh916 Jun 20, 2023
e1b8935
upgraded `complex_citation_node` fixture
nh916 Jun 20, 2023
4d7259f
wrote `test_integration_citation`
nh916 Jun 20, 2023
5687cdc
changed order of the print statements to make more sense
nh916 Jun 20, 2023
3c5fc39
save
nh916 Jun 23, 2023
199e1a2
trying to compare JSONs for what we sent and received
nh916 Jun 26, 2023
29854f9
removing `try` `catch` block to handle API duplicate projects errors
nh916 Jun 26, 2023
0dab56d
deepDiff with `exclude_regex_paths` not working for comparison
nh916 Jun 26, 2023
2fe8aa5
deepDiff catching the correct differences
nh916 Jun 26, 2023
a26f3cd
deepDiff catching the correct differences
nh916 Jun 26, 2023
a61548a
renaming the integration project for experiment so there is no duplic…
nh916 Jun 26, 2023
f73fdcf
updated docstrings for `integrate_nodes_helper` helper function
nh916 Jun 28, 2023
7fb9d80
fixed `test_integration_computational_process` OrphanedMaterialNode, …
nh916 Jun 28, 2023
9a8f968
still getting `CRIPTOrphanedProcessError`
nh916 Jun 28, 2023
5c26354
process integration test successful!
nh916 Jun 28, 2023
0b90d1d
added comment
nh916 Jun 28, 2023
a7f3c03
removed print statement from test
nh916 Jun 28, 2023
4c360ad
fixed OrphanedNodeError
nh916 Jun 28, 2023
bc291a4
added todo
nh916 Jun 28, 2023
2566054
found an issue to fix
nh916 Jun 28, 2023
7c4ada0
adding arguments to complex_condition fixture instantiation
nh916 Jun 29, 2023
12192f1
Merge branch 'develop' into wip_integration_tests
nh916 Jun 29, 2023
3afd052
added `simple_condition_node`
nh916 Jun 29, 2023
4e4f9db
wrote `test_integration_process_condition` but getting `CRIPTJsonDese…
nh916 Jun 29, 2023
cea3f30
wrote `simple_ingredient_node` fixture
nh916 Jun 29, 2023
3674c5f
updated keyword for `simple_ingredient_node` fixture
nh916 Jun 29, 2023
01325ae
`test_integration_material_ingredient` written but getting `bad UUID …
nh916 Jun 29, 2023
8c5bfec
`test_integration_material_ingredient` written but getting `bad UUID …
nh916 Jun 29, 2023
3029305
updated docstring for `test_integration_ingredient`
nh916 Jun 29, 2023
ed3984b
wrote `test_integration_quantity`
nh916 Jun 29, 2023
17e7c57
fixed `simple_software_configuration` fixture
nh916 Jun 29, 2023
ecc7645
`test_integration_software_configuration` successful!
nh916 Jun 29, 2023
eb55358
adding `simple_software_configuration` fixture
nh916 Jun 29, 2023
710061c
adding `simple_software_configuration` fixture to conftest.py
nh916 Jun 29, 2023
778503c
`test_integration_algorithm` successful!
nh916 Jun 29, 2023
88a62e2
added description to `simple_equipment_node` fixture
nh916 Jun 29, 2023
bea38ea
`test_integration_equipment` successful!
nh916 Jun 29, 2023
5598646
`test_integration_parameter` hitting deserialization error
nh916 Jun 29, 2023
5b0991f
moved around the print statements a bit to make it easier to debug
nh916 Jun 29, 2023
4af9759
`test_integration_material_property` successful!
nh916 Jun 29, 2023
2430fda
`test_integration_computational_forcefield` successful!
nh916 Jun 29, 2023
b70962c
wrote `simplest_computational_process_node` fixture
nh916 Jun 29, 2023
8207015
updated `test_integration_computational_process`
nh916 Jun 29, 2023
8f9570e
removed print statement from `test_integration_process_condition`
nh916 Jul 5, 2023
4569a7a
fixed `equipment/test_json`
nh916 Jul 5, 2023
6d4707b
fixed `test_property/test_json`
nh916 Jul 5, 2023
66df963
fixed `test_software_configuration/test_json`
nh916 Jul 5, 2023
7db2f92
switching order of print statement for debugging purposes
nh916 Jul 5, 2023
bccac8b
updated `test_computational_forcefield` and is passing
nh916 Jul 5, 2023
297fb9d
fix condition integration error: the backend was sending str values i…
InnocentBug Jul 6, 2023
ed92e7b
added comment
nh916 Jul 6, 2023
0d51ace
wrote up design for save_helper.py for `Bad UUID` errors
nh916 Jul 6, 2023
04028d3
fix parameter.value type issue with temporary fix
InnocentBug Jul 6, 2023
511b1c2
designed brute_force_save
nh916 Jul 6, 2023
95da76e
broke save into save and send post request
nh916 Jul 6, 2023
715ea6b
Merge remote-tracking branch 'origin/wip_integration_tests' into wip_…
nh916 Jul 6, 2023
9a654af
put `get_bad_uuid_from_error_message` into a helper function
nh916 Jul 6, 2023
72b939b
wrote the loop for `brute_force_save`
nh916 Jul 6, 2023
f04f222
Bad UUID handling (#186)
InnocentBug Jul 7, 2023
819b149
Refactor the save a little bit. Patch does not work.
InnocentBug Jul 7, 2023
6158d83
extend exception to make handling some things more nicely
InnocentBug Jul 10, 2023
57b8dde
adjust JSON to for patching
InnocentBug Jul 10, 2023
8989bc0
wrote `test_integration_software` for test_software.py successfully!
nh916 Jul 10, 2023
0b186f2
wrote host and token placeholder within conftest.py
nh916 Jul 10, 2023
1d70508
Merge branch 'develop' into wip_integration_tests
nh916 Jul 10, 2023
e7ded96
removed unused variable
nh916 Jul 10, 2023
2fd89e1
fix cspell
InnocentBug Jul 10, 2023
e93321b
Refactor the save a little bit. Patch does not work. (#190)
InnocentBug Jul 10, 2023
22ec46e
fix import
InnocentBug Jul 10, 2023
2aeda2b
Merge branch 'wip_integration_tests' of github.com:C-Accel-CRIPT/Pyth…
InnocentBug Jul 10, 2023
df07120
Merge branch 'wip_integration_tests' into handle-patch
InnocentBug Jul 10, 2023
abf7bde
add some further stuff to make it better readable.
InnocentBug Jul 11, 2023
25f4f66
fix
InnocentBug Jul 11, 2023
c6177a8
Merge branch 'develop' into handle-patch
InnocentBug Jul 11, 2023
4658233
fix import
InnocentBug Jul 11, 2023
7da07f5
fix mypy warning
InnocentBug Jul 11, 2023
f9184aa
convert it to iterative internal save
InnocentBug Jul 11, 2023
576fc0b
fix mypy
InnocentBug Jul 11, 2023
70b0f7f
add regex comments
InnocentBug Jul 11, 2023
2625c64
add comment for error message parsing
InnocentBug Jul 11, 2023
36e0ea9
add comments
InnocentBug Jul 12, 2023
a576f9d
Wrote Integration Tests for Update/PATCH (#197)
nh916 Jul 14, 2023
0c1b26d
work around for non-working GET
InnocentBug Jul 14, 2023
b948021
mypy ignore
InnocentBug Jul 14, 2023
fa30341
fix parameter test
InnocentBug Jul 14, 2023
31b0b20
small update
nh916 Jul 17, 2023
3525599
updated db schema to work with `POST` and `PATCH`
nh916 Jul 20, 2023
ef56054
updated docstrings
nh916 Jul 20, 2023
41c70c0
Merge branch 'develop' into handle-patch
InnocentBug Jul 20, 2023
610030a
fix trunk
InnocentBug Jul 20, 2023
017266e
Merge remote-tracking branch 'origin/update_db_schema_validation' int…
InnocentBug Jul 20, 2023
0661e8a
enable is patch for validation
InnocentBug Jul 20, 2023
c2b38cb
change save validation to respect patch
InnocentBug Jul 20, 2023
e2d26ad
fix project.validate
InnocentBug Jul 20, 2023
4a8fc40
revert to working state
InnocentBug Jul 20, 2023
ffc89e3
commented out test_integration.py for CI
nh916 Jul 20, 2023
5430b24
optimized imports
nh916 Jul 20, 2023
108 changes: 76 additions & 32 deletions src/cript/api/api.py
@@ -4,7 +4,7 @@
import uuid
import warnings
from pathlib import Path
from typing import Any, Dict, List, Optional, Set, Union
from typing import Any, Dict, List, Optional, Union

import boto3
import jsonschema
@@ -21,7 +21,12 @@
)
from cript.api.paginator import Paginator
from cript.api.utils.get_host_token import resolve_host_and_token
from cript.api.utils.save_helper import fix_node_save
from cript.api.utils.save_helper import (
_fix_node_save,
_get_uuid_from_error_message,
_identify_suppress_attributes,
_InternalSaveValues,
)
from cript.api.valid_search_modes import SearchModes
from cript.api.vocabulary_categories import ControlledVocabularyCategories
from cript.nodes.exceptions import CRIPTJsonNodeError, CRIPTNodeSchemaError
@@ -514,11 +519,14 @@ def save(self, project: Project) -> None:
"""
try:
self._internal_save(project)
except Exception as exc:
# TODO remove all pre-handled nodes.
except CRIPTAPISaveError as exc:
if exc.pre_saved_nodes:
for node_uuid in exc.pre_saved_nodes:
# TODO remove all pre-saved nodes by their uuid.
pass
raise exc from exc

def _internal_save(self, node, known_uuid: Optional[Set[str]] = None) -> Optional[Set[str]]:
def _internal_save(self, node, save_values: Optional[_InternalSaveValues] = None) -> _InternalSaveValues:
"""
Internal helper function that handles the saving of different nodes (not just project).

@@ -528,42 +536,77 @@ def _internal_save(self, node, known_uuid: Optional[Set[str]] = None) -> Optiona
This works, because we keep track of "Bad UUID" handled nodes, and represent them in the JSON only as the UUID.
"""

# known_uuid are node, that we have saved to the back end before.
# We keep track of it, so that we can condense them to UUID only in the JSON.
if known_uuid is None:
known_uuid = set()
if save_values is None:
save_values = _InternalSaveValues()

node.validate()
# saves all the local files to cloud storage right before saving the Project node
# Ensure that all file nodes have uploaded their payload before the actual save.
for file_node in node.find_children({"node": ["File"]}):
file_node.ensure_uploaded(api=self)

# We assemble the JSON to be saved to back end.
# Note how we exclude pre-saved uuid nodes.
json_data = node.get_json(known_uuid=known_uuid).json

# This checks if the current node exists on the back end.
# if it does exist we use `patch` if it doesn't `post`.
node_known = len(self.search(type(node), SearchModes.UUID, str(node.uuid)).current_page_results) == 1
if node_known:
response: Dict = requests.patch(url=f"{self._host}/{node.node_type.lower()}/{str(node.uuid)}", headers=self._http_headers, data=json_data).json()
else:
response: Dict = requests.post(url=f"{self._host}/{node.node_type.lower()}", headers=self._http_headers, data=json_data).json() # type: ignore

# If we get an error we may be able to fix, we to handle this extra and save the bad node first.
# Errors with this code, may be fixable
if response["code"] in (400, 409):
nodes_fixed = fix_node_save(self, node, response, known_uuid)
# In case of a success, we return the know uuid
if nodes_fixed is not False:
return nodes_fixed
# if not successful, we escalate the problem further
# Dummy response to have a virtual do-while loop, instead of while loop.
response = {"code": -1}
# TODO remove once get works properly
force_patch = False

while response["code"] != 200:
# Keep a record of how the state was before the loop
old_save_values = copy.deepcopy(save_values)
# We assemble the JSON to be saved to back end.
# Note how we exclude pre-saved uuid nodes.
json_data = node.get_json(known_uuid=save_values.saved_uuid, suppress_attributes=save_values.suppress_attributes).json

# This checks if the current node exists on the back end.
# if it does exist we use `patch` if it doesn't `post`.
test_get_response: Dict = requests.get(url=f"{self._host}/{node.node_type_snake_case}/{str(node.uuid)}", headers=self._http_headers).json()
patch_request = test_get_response["code"] == 200

# TODO remove once get works properly
if not patch_request and force_patch:
patch_request = True
force_patch = False

# If all that is left is a UUID, we don't need to save it, we can just exit the loop.
if patch_request and len(json.loads(json_data)) == 1:
response = {"code": 200}
break

if patch_request:
response: Dict = requests.patch(url=f"{self._host}/{node.node_type_snake_case}/{str(node.uuid)}", headers=self._http_headers, data=json_data).json() # type: ignore
else:
response: Dict = requests.post(url=f"{self._host}/{node.node_type_snake_case}", headers=self._http_headers, data=json_data).json() # type: ignore

# If we get an error we may be able to fix, we try to handle this extra and save the bad node first.
# Errors with this code may be fixable
if response["code"] in (400, 409):
returned_save_values = _fix_node_save(self, node, response, save_values)
save_values += returned_save_values

# Handle errors from patching with too many attributes
if patch_request and response["code"] in (400,):
suppress_attributes = _identify_suppress_attributes(node, response)
new_save_values = _InternalSaveValues(save_values.saved_uuid, suppress_attributes)
save_values += new_save_values

# It is only worthwhile repeating the attempted save loop if our state has improved.
# Aka we did something to fix the occurring error
if not save_values > old_save_values:
# TODO remove once get works properly
if not patch_request and response["code"] == 409 and response["error"].strip().startswith("Duplicate uuid:"): # type: ignore
duplicate_uuid = _get_uuid_from_error_message(response["error"]) # type: ignore
if str(node.uuid) == duplicate_uuid:
print("force_patch", node.uuid)
force_patch = True
continue

break

if response["code"] != 200:
raise CRIPTAPISaveError(api_host_domain=self._host, http_code=response["code"], api_response=response["error"])
raise CRIPTAPISaveError(api_host_domain=self._host, http_code=response["code"], api_response=response["error"], patch_request=patch_request, pre_saved_nodes=save_values.saved_uuid, json_data=json_data) # type: ignore

return known_uuid
save_values.saved_uuid.add(str(node.uuid))
return save_values

def upload_file(self, file_path: Union[Path, str]) -> str:
# trunk-ignore-begin(cspell)
@@ -717,7 +760,8 @@ def search(
"""

# get node typ from class
node_type = node_type.node_type.lower()
node_type = node_type.node_type_snake_case
print(node_type)

# always putting a page parameter of 0 for all search URLs
page_number = 0
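The `response = {"code": -1}` dummy in `_internal_save` above emulates a do-while loop, which Python lacks natively. A minimal standalone sketch of that pattern, with a hypothetical `attempt_save` stub standing in for the real POST/PATCH requests:

```python
import itertools

# Hypothetical stub standing in for the real POST/PATCH request;
# it fails twice, then succeeds, to exercise the retry loop.
_tries = itertools.count(1)

def attempt_save() -> dict:
    return {"code": 200 if next(_tries) >= 3 else 400}

# Dummy response to enter the loop body at least once (virtual do-while).
response = {"code": -1}
attempts = 0
while response["code"] != 200:
    attempts += 1
    response = attempt_save()
    if attempts > 10:  # safety valve so the sketch can never loop forever
        break
```

Seeding `response` with an impossible code guarantees one pass through the body, mirroring how the save loop always issues at least one request before checking for success.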
14 changes: 11 additions & 3 deletions src/cript/api/exceptions.py
@@ -1,4 +1,4 @@
from typing import List
from typing import List, Optional, Set

from cript.exceptions import CRIPTException

@@ -110,13 +110,21 @@ class CRIPTAPISaveError(CRIPTException):
http_code: str
api_response: str

def __init__(self, api_host_domain: str, http_code: str, api_response: str):
def __init__(self, api_host_domain: str, http_code: str, api_response: str, patch_request: bool, pre_saved_nodes: Optional[Set[str]] = None, json_data: Optional[str] = None):
Review comment: this exception class will need clean up later

Review comment: I recommend passing in the HTTP method as a string and uppercasing it so the errors are always uniform. Having a bool works right now because we only have POST and PATCH, but once we add DELETE and PUT it will not be flexible. I think it is fine for now to get the project done, but it will need refactoring in the future to be more maintainable.
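A minimal sketch of that suggestion: carry the HTTP method as a normalized string rather than a `patch_request` bool (the class and parameter names here are illustrative, not the SDK's actual API):

```python
class SaveError(Exception):
    """Illustrative exception carrying the HTTP method as a string (not the SDK's CRIPTAPISaveError)."""

    def __init__(self, http_code: str, api_response: str, http_method: str):
        self.http_code = http_code
        self.api_response = api_response
        # Uppercase once at construction so error messages are always uniform.
        self.http_method = http_method.upper()

    def __str__(self) -> str:
        return f"API responded to {self.http_method} with 'http:{self.http_code} {self.api_response}'"

message = str(SaveError("409", "Duplicate uuid", "patch"))
```

Any future method (DELETE, PUT) then works without touching the exception class.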

self.api_host_domain = api_host_domain
self.http_code = http_code
self.api_response = api_response
self.patch_request = patch_request
self.pre_saved_nodes = pre_saved_nodes
self.json_data = json_data

def __str__(self) -> str:
error_message = f"API responded with 'http:{self.http_code} {self.api_response}'"
type = "POST"
if self.patch_request:
type = "PATCH"
error_message = f"API responded to {type} with 'http:{self.http_code} {self.api_response}'"
if self.json_data:
error_message += f" data: {self.json_data}"

return error_message

135 changes: 122 additions & 13 deletions src/cript/api/utils/save_helper.py
@@ -1,29 +1,114 @@
def fix_node_save(api, node, response, known_uuid):
import json
import re
import uuid
from dataclasses import dataclass, field
from typing import Dict, Set


@dataclass
class _InternalSaveValues:
"""
Class that carries attributes to be carried through recursive calls of _internal_save.
"""

saved_uuid: Set[str] = field(default_factory=set)
suppress_attributes: Dict[str, Set[str]] = field(default_factory=dict)

def __add__(self, other: "_InternalSaveValues") -> "_InternalSaveValues":
"""
Implement a shorthand to combine two of these save values with `+`.
This unions the `saved_uuid` sets,
and safely unions `suppress_attributes` too.
"""
# Make a manual copy of `self`.
return_value = _InternalSaveValues(self.saved_uuid.union(other.saved_uuid), self.suppress_attributes)

# Union the dictionary.
for uuid_str in other.suppress_attributes:
try:
# If the uuid exists in both `suppress_attributes` union the value sets
return_value.suppress_attributes[uuid_str] = return_value.suppress_attributes[uuid_str].union(other.suppress_attributes[uuid_str])
except KeyError:
# If it only exists in one, just copy the set into the new one.
return_value.suppress_attributes[uuid_str] = other.suppress_attributes[uuid_str]
return return_value

def __gt__(self, other):
"""
A greater comparison to see if something was added to the info.
"""
if len(self.saved_uuid) > len(other.saved_uuid):
return True
if len(self.suppress_attributes) > len(other.suppress_attributes):
return True
# If the two dicts have the same key, make sure at least one key has more suppressed attributes
if self.suppress_attributes.keys() == other.suppress_attributes.keys():
longer_set_found = False
for key in other.suppress_attributes:
if len(self.suppress_attributes[key]) < len(other.suppress_attributes[key]):
return False
if self.suppress_attributes[key] > other.suppress_attributes[key]:
longer_set_found = True
return longer_set_found
return False
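The union semantics of `__add__` above can be illustrated on plain data (a simplified sketch, not the SDK class itself):

```python
from typing import Dict, Set

def merge_suppress(a: Dict[str, Set[str]], b: Dict[str, Set[str]]) -> Dict[str, Set[str]]:
    """Union two {uuid: attribute-set} dicts the way _InternalSaveValues.__add__ does."""
    merged = {key: set(value) for key, value in a.items()}
    for key, attrs in b.items():
        # Union the sets when the uuid exists in both, otherwise copy it over.
        merged[key] = merged.get(key, set()) | attrs
    return merged

left = {"uuid-1": {"name"}}
right = {"uuid-1": {"notes"}, "uuid-2": {"name"}}
combined = merge_suppress(left, right)
```

Building a fresh dict first keeps the operation non-mutating, matching how `__add__` returns a new `_InternalSaveValues` instead of editing `self`.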


def _fix_node_save(api, node, response, save_values: _InternalSaveValues) -> _InternalSaveValues:
"""
Helper function, that attempts to fix a bad node.
And if it is fixable, we resave the entire node.

Returns set of known uuids, if fixable, otherwise False.
"""
assert response["code"] in (400, 409)
if response["code"] not in (400, 409):
raise RuntimeError(f"The internal helper function `_fix_node_save` has been called for an error that is not yet implemented to be handled {response}.")

if response["error"].startswith("Bad uuid:") or response["error"].strip().startswith("Duplicate uuid:"):
missing_uuid = get_uuid_from_error_message(response["error"])
missing_uuid = _get_uuid_from_error_message(response["error"])
missing_node = find_node_by_uuid(node, missing_uuid)

# If the missing node is the same as the one we are trying to save, this is not working.
# We end the infinite loop here.
if missing_uuid == str(node.uuid):
return save_values
# Now we save the bad node extra.
# So it will be known when we attempt to save the graph again.
# Since we pre-saved this node, we want it to be UUID edge only the next JSON.
# So we add it to the list of known nodes
known_uuid.union(api._internal_save(missing_node, known_uuid)) # type: ignore
returned_save_values = api._internal_save(missing_node, save_values)
save_values += returned_save_values
# The missing node, is now known to the API
known_uuid.add(missing_uuid)
# Recursive call.
# Since we should have fixed the "Bad UUID" now, we can try to save the node again
return api._internal_save(node, known_uuid)
return False
save_values.saved_uuid.add(missing_uuid)

# Handle all duplicate items warnings if possible
if response["error"].startswith("duplicate item"):
for search_dict_str in re.findall(r"\{(.*?)\}", response["error"]): # Regular expression finds all text elements enclosed in `{}`. In the error message this is the dictionary describing the duplicated item.
# The error message contains a description of the offending elements.
search_dict_str = "{" + search_dict_str + "}"
search_dict_str = search_dict_str.replace("'", '"')
search_dict = json.loads(search_dict_str)
# These are in the exact format to use with `find_children` so we find all the offending children.
all_duplicate_nodes = node.find_children(search_dict)
for duplicate_node in all_duplicate_nodes:
# Unfortunately, even patch errors if you patch with an offending element.
# So we remove the offending element from the JSON
# TODO IF THIS IS A TRUE DUPLICATE NAME ERROR, IT WILL ERROR AS THE NAME ATTRIBUTE IS MISSING.
try:
# the search_dict conveniently lists all the offending attributes in its keys.
# So if we haven't listed the current node in the suppress attribute dict, we add the node with the offending attributes to suppress.
save_values.suppress_attributes[str(duplicate_node.uuid)] = set(search_dict.keys())
except KeyError:
# If we have the current node in the dict, we just add the new elements to the list of suppressed attributes for it.
save_values.suppress_attributes[str(duplicate_node.uuid)].add(set(search_dict.keys())) # type: ignore

# Attempts to save the duplicate items element.
save_values += api._internal_save(duplicate_node, save_values)
# After the save, we can reduce it to just a UUID edge in the graph (avoiding the duplicate issues).
save_values.saved_uuid.add(str(duplicate_node.uuid))

return save_values
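The `re.findall(r"\{(.*?)\}", ...)` extraction above can be demonstrated in isolation against a representative error message (the exact backend wording is an assumption):

```python
import json
import re

# Hypothetical error message in the shape the duplicate-item handler expects.
error = "duplicate item {'node': ['Material'], 'name': 'my material'} in project"

search_dicts = []
# The regex captures every text span enclosed in `{}` — the duplicated-item dicts.
for search_dict_str in re.findall(r"\{(.*?)\}", error):
    # Re-wrap the captured body in braces and swap quotes so json can parse it.
    search_dict_str = "{" + search_dict_str + "}"
    search_dicts.append(json.loads(search_dict_str.replace("'", '"')))
```

Each resulting dict is in the exact shape `find_children` accepts, which is why the handler can feed it straight back into the node tree search.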


def get_uuid_from_error_message(error_message: str) -> str:
def _get_uuid_from_error_message(error_message: str) -> str:
"""
takes a CRIPTAPISaveError and tries to get the UUID that the API is having trouble with,
and returns it
@@ -37,19 +122,43 @@ def get_uuid_from_error_message(error_message: str) -> str:
UUID
the UUID the API had trouble with
"""
bad_uuid = None
if error_message.startswith("Bad uuid: "):
bad_uuid = error_message[len("Bad uuid: ") : -len(" provided")].strip()
if error_message.strip().startswith("Duplicate uuid:"):
bad_uuid = error_message[len(" Duplicate uuid:") : -len("provided")].strip()
if bad_uuid is None or len(bad_uuid) != len(str(uuid.uuid4())): # Ensure we found a full UUID describing string (here tested against a random new uuid length.)
raise RuntimeError(f"The internal helper function `_get_uuid_from_error_message` has been called for an error message that is not yet implemented to be handled. error message {error_message}, found uuid {bad_uuid}.")

return bad_uuid
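The slicing in `_get_uuid_from_error_message` above can be exercised standalone (reimplemented here as a sketch of the SDK helper):

```python
import uuid

def get_bad_uuid(error_message: str) -> str:
    """Extract the UUID from a 'Bad uuid: <uuid> provided' message (sketch)."""
    bad_uuid = None
    if error_message.startswith("Bad uuid: "):
        # Strip the fixed prefix and the trailing " provided" to isolate the UUID.
        bad_uuid = error_message[len("Bad uuid: ") : -len(" provided")].strip()
    # A canonical UUID string is 36 characters; anything else means an unhandled message.
    if bad_uuid is None or len(bad_uuid) != len(str(uuid.uuid4())):
        raise RuntimeError(f"Unhandled error message: {error_message}")
    return bad_uuid

found = get_bad_uuid("Bad uuid: 123e4567-e89b-12d3-a456-426614174000 provided")
```

Comparing against the length of a freshly generated UUID is the same sanity check the helper uses to reject partial matches.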


def find_node_by_uuid(node, uuid: str):
def find_node_by_uuid(node, uuid_str: str):
# Use the find_children functionality to find that node in our current tree
# We can have multiple occurrences of the node,
# but it doesn't matter which one we save
# TODO some error handling, for the BUG case of not finding the UUID
missing_node = node.find_children({"uuid": uuid})[0]
missing_node = node.find_children({"uuid": uuid_str})[0]

return missing_node


def _identify_suppress_attributes(node, response: Dict) -> Dict[str, Set[str]]:
suppress_attributes: Dict[str, Set[str]] = {}
if response["error"].startswith("Additional properties are not allowed"):
# Find all the attributes, that are listed in the error message with regex
attributes = set(re.findall(r"'(.*?)'", response["error"])) # regex finds all attributes in enclosing `'`. This is how the error message lists them.

# At the end of the error message the offending path is given.
# The structure of the error message is such that it is after `path:`, so we find and strip the path out of the message.
path = response["error"][response["error"].rfind("path:") + len("path:") :].strip()

if path != "/":
# TODO find the UUID this belongs to
raise RuntimeError("Fixing non-root objects for patch, not implemented yet. This is a bug, please report it on https://github.com/C-Accel-CRIPT/Python-SDK/ .")

try:
suppress_attributes[str(node.uuid)].add(attributes) # type: ignore
except KeyError:
suppress_attributes[str(node.uuid)] = attributes
return suppress_attributes
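The attribute regex in `_identify_suppress_attributes` can be checked against a representative message (the exact backend wording is an assumption):

```python
import re

# Hypothetical backend validation error listing the offending attributes in quotes.
error = "Additional properties are not allowed ('member', 'admin' were unexpected) path: /"

# Same regex as the helper: every substring enclosed in single quotes.
attributes = set(re.findall(r"'(.*?)'", error))
# The offending path appears after the final "path:" in the message.
path = error[error.rfind("path:") + len("path:"):].strip()
```

The non-greedy `(.*?)` pairs each opening quote with the nearest closing quote, so multiple quoted attributes in one message are each captured separately.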