feat: adding json writing, reading infrastructure #2283
Conversation
📊 Physics performance monitoring for a68ef55
Summary: Vertexing, Seeding, CKF, Ambiguity resolution, Truth tracking (Kalman Filter), Truth tracking (GSF)
How does polymorphism prevent us from using
Codecov Report

```
@@           Coverage Diff            @@
##             main    #2283    +/-  ##
========================================
- Coverage   49.33%   49.30%   -0.03%
========================================
  Files         447      447
  Lines       25149    25162      +13
  Branches    11571    11577       +6
========================================
  Hits        12407    12407
- Misses       4542     4555      +13
  Partials     8200     8200
```
I encountered two problems:
If you look through the
I think the idea of
In case of polymorphism you either need to know what type you are parsing, or you need a base class parser that can deduce the actual type. But maybe I did not understand the application in this case.
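The base-class-parser idea mentioned above can be sketched as a small type-tag registry: the json carries a `"type"` field, and a factory map keyed on that field deduces the concrete type before construction. This is a dependency-free illustration (a `std::map` stands in for `nlohmann::json`); all class and function names here are hypothetical, not the project's actual API.

```cpp
#include <functional>
#include <map>
#include <memory>
#include <stdexcept>
#include <string>

// Stand-in for nlohmann::json, to keep the sketch dependency-free.
using Json = std::map<std::string, std::string>;

struct Surface {
  virtual ~Surface() = default;
  virtual std::string name() const = 0;
};

struct CylinderSurface : Surface {
  std::string name() const override { return "cylinder"; }
};
struct DiscSurface : Surface {
  std::string name() const override { return "disc"; }
};

// A base-class parser needs a registry keyed on the "type" field
// to deduce the concrete type before constructing the object.
using Factory = std::function<std::unique_ptr<Surface>(const Json&)>;

std::unique_ptr<Surface> fromJson(const Json& j,
                                  const std::map<std::string, Factory>& reg) {
  auto it = reg.find(j.at("type"));
  if (it == reg.end()) {
    throw std::runtime_error("unknown surface type");
  }
  return it->second(j);
}
```

This is exactly the dispatch step that the intrinsic `from_json` ADL mechanism cannot do on its own, since it needs the static type up front.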
I don't know. Ultimately,
Certainly worth the discussion! If we manage to use the native `to_json` / `from_json` design, I am certainly up for it. Let me try to summarise what is needed:
So, my design for this was so far:

Writing: …
Reading: …

For the reading, e.g. association objects could be the already created volumes, such that the links to those can be restored. How this is done in the detector reading is that e.g. all volumes are constructed first, then all portals are constructed and the list of volumes is provided at portal construction.

If I understand that correctly, I would do a tuple object and define read/write for that?
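The two-phase reading described above (all volumes constructed first, then portals constructed with the volume list in hand) can be sketched like this. It is a dependency-free illustration with hypothetical struct names, not the actual converter code:

```cpp
#include <cstddef>
#include <string>
#include <vector>

struct Volume {
  std::string name;
};

// What would be decoded from json for a portal: the link to a
// volume is stored as an index, not as a pointer.
struct PortalJson {
  std::size_t volumeIndex;
};

struct Portal {
  const Volume* volume = nullptr;  // link restored at construction
};

// Phase 2: portals are constructed against the already-created
// volume list, so the index read from json can be turned back
// into an in-memory pointer.
std::vector<Portal> buildPortals(const std::vector<PortalJson>& in,
                                 const std::vector<Volume>& volumes) {
  std::vector<Portal> out;
  out.reserve(in.size());
  for (const auto& pj : in) {
    out.push_back(Portal{&volumes.at(pj.volumeIndex)});
  }
  return out;
}
```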
The bigger problem is actually that we cannot serialise everything on an object level. Think of our Detector/TrackingGeometry concept: volumes can share portal surfaces which then point to volumes again. My custom toJson/fromJson methods allow me to do that in a clever way, i.e. I can reuse and optimise information: e.g. I can write a single portal container to the output json and then, when the actual portals of the volumes are written, I only write the index into this container to the volumes (this requires the volume to be written in the context of a detector setup).
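The index-based sharing described above can be sketched as follows: each shared portal is written once into a single container, and every volume only records indices into that container. Again a dependency-free sketch with hypothetical names:

```cpp
#include <cstddef>
#include <map>
#include <memory>
#include <vector>

struct Portal {};

struct Volume {
  std::vector<std::shared_ptr<Portal>> portals;
};

// What would end up in the output json at the detector level.
struct DetectorJson {
  std::vector<const Portal*> portalContainer;           // each portal once
  std::vector<std::vector<std::size_t>> volumePortals;  // indices per volume
};

DetectorJson toJson(const std::vector<Volume>& volumes) {
  DetectorJson out;
  std::map<const Portal*, std::size_t> seen;
  for (const auto& v : volumes) {
    std::vector<std::size_t> indices;
    for (const auto& p : v.portals) {
      // A shared portal is written only on first encounter; afterwards
      // only its index in the container is stored with the volume.
      auto [it, inserted] = seen.emplace(p.get(), out.portalContainer.size());
      if (inserted) {
        out.portalContainer.push_back(p.get());
      }
      indices.push_back(it->second);
    }
    out.volumePortals.push_back(std::move(indices));
  }
  return out;
}
```

This is why the volume can only be written in the context of a detector setup: the deduplicating container lives one level above the volume.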
to be continued
Co-authored-by: Joana Niermann <53186085+niermann999@users.noreply.github.com>
I am wondering if the detray implementation should go into its own file, at least for the tests, so that detray conversion bugs don't fail the general tests?
Tests/UnitTests/Plugins/Json/DetectorVolumeJsonConverterTests.cpp
@niermann999 - all comments addressed, I suppose.
Otherwise, I think this can go forward
This PR adds the reading/writing infrastructure of the new `Experimental` detector feature to and from json. At the same time it streamlines how we use the `nlohmann::json` module:

- As we need polymorphism, the intrinsic `to_json` and `from_json` nomenclature that would allow auto-translation is practically unusable for the `Detector`. Hence it is replaced by a consistent `ObjectJsonConverter::toJson` and `ObjectJsonConverter::fromJson` naming scheme.
- It changes enum types from hand-written string writing to the `nlohmann` enum-handling macro consistently.
- It introduces a nested `Options` struct for future refinement of json writing.
- It adds a dedicated `detray` writing mode for conversion into the detray format.

---------

Co-authored-by: Joana Niermann <53186085+niermann999@users.noreply.github.com>
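The nlohmann enum-handling macro mentioned in the description (`NLOHMANN_JSON_SERIALIZE_ENUM`) is essentially built on a static value/name pair table mapped in both directions. A dependency-free sketch of that pattern, with an illustrative enum (the names are only examples):

```cpp
#include <array>
#include <stdexcept>
#include <string>
#include <utility>

enum class BinningValue { binR, binZ, binPhi };

// Static pair table, the core of what NLOHMANN_JSON_SERIALIZE_ENUM
// generates: one lookup direction for writing, one for reading.
constexpr std::array<std::pair<BinningValue, const char*>, 3> kBinningNames{
    {{BinningValue::binR, "binR"},
     {BinningValue::binZ, "binZ"},
     {BinningValue::binPhi, "binPhi"}}};

std::string toString(BinningValue v) {
  for (const auto& [val, name] : kBinningNames) {
    if (val == v) return name;
  }
  throw std::invalid_argument("unknown enum value");
}

BinningValue fromString(const std::string& s) {
  for (const auto& [val, name] : kBinningNames) {
    if (s == name) return val;
  }
  throw std::invalid_argument("unknown enum name");
}
```

Compared to hand-written string writing per enum, the table keeps both directions in one place, so adding an enumerator cannot silently desynchronise reading and writing.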