
Commit

In Proofreading, Load Oversegmentation, Perform Merges Eagerly in Frontend (#7654)

* add datastore route to list all agglomerate ids

* remove unused file access

* types

* Apply hdf5 mappings in frontend [WIP]

* Fix segment and agglomerate id mixup in proofreading saga

* add route to tracingstore

* edit mapping after merge and set it in redux store

* Fix mapping initialization

* WIP: implement split in frontend

* Only load subset of mapping for segments visible 5s after page load

* small fix

* cleanup, time measurements, throttle mapping requests to 500ms -> currently too laggy

* Update redux-saga to use throttle with effectChannels

* Only refresh mapping if bucket data changed. Fix partial mapping updates after master merge.

* Fix how proofreading actions update the mapping in the frontend

* disable most ci checks

* misc improvements

* refine shouldUseDataStore logic

* fix type error

* fix some type errors

* fix more type errors

* use NumberLike type for number and bigint at various places
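
A minimal sketch of the `NumberLike` idea (hypothetical; the actual type lives in `oxalis/store` and may differ):

```typescript
// Hypothetical sketch of a NumberLike union covering both ids that fit into
// a JS number and full 64-bit ids (bigint).
type NumberLike = number | bigint;

// Type guard used to decide whether values must be adapted to bigint.
function isBigInt(x: NumberLike): x is bigint {
  return typeof x === "bigint";
}

// Comparator usable for mixed inputs. Subtracting a number from a bigint
// throws in JS, so both operands are converted to Number first (exact for
// ids below 2^53).
function compareNumberLike(a: NumberLike, b: NumberLike): number {
  const diff = Number(a) - Number(b);
  return diff < 0 ? -1 : diff > 0 ? 1 : 0;
}
```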

* more NumberLike usages

* fix lots of typescript errors (related to bigint and to redux)

* fix more type errors

* fix the last type errors

* restore proper NumberLike definition and fix remaining type errors

* remove unused imports

* fix race condition which could cause mapping to not be properly initialized upon reload

* only pass isCentered instead of centeredSegmentId to SegmentListItems

* fix react warning when closing context menu for the first time

* add debug code for rare scenario where segment id is not an integer

* fix TS error

* fix that hovered unmapped segment id would not update sometimes

* avoid two map look ups

* add dryUpdate step before saving proofreading update actions

* undo provoking the error

* make selective segment visibility in proofreading an option

* fix NaN value after mapping unknown segment key

* avoid parallel executions of updateHdf5Mapping and also cancel updateHdf5Mapping if wk enters a busy state to avoid using outdated values
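
The idea ("at most one update in flight; a newer trigger or a busy state aborts the running one") can be sketched dependency-free with `AbortController`; the actual code uses redux-saga effects, and `updateMappingExclusive` is a placeholder name:

```typescript
// Sketch: serialize a long-running mapping update. Starting a new update
// aborts a still-running one instead of executing two in parallel, so no
// outdated values are applied afterwards.
let currentRun: AbortController | null = null;

async function updateMappingExclusive(
  doUpdate: (signal: AbortSignal) => Promise<void>,
): Promise<void> {
  // Abort a still-running update before starting the next one.
  if (currentRun != null) {
    currentRun.abort();
  }
  const controller = new AbortController();
  currentRun = controller;
  try {
    await doUpdate(controller.signal);
  } finally {
    if (currentRun === controller) {
      currentRun = null;
    }
  }
}

// Called when the app enters a busy state: cancel any in-flight update.
function abortPendingMappingUpdate(): void {
  if (currentRun != null) {
    currentRun.abort();
    currentRun = null;
  }
}
```

The callback is expected to check `signal.aborted` (or pass the signal to fetch) and bail out early when cancelled.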

* always compare known segment ids to newest mapping instead of 'previous' one that isn't updated when doing proofreading operations

* don't crash completely when segment mapping is not known yet and user initiates proofreading action

* remove artificial delays

* improve logging

* fix ts error

* remove debugging code for NaN mapped id

* tweak hovering

* rename MIN_CUT_AGGLOMERATE actions

* adapt cutFromNeighbors to magic mapping approach

* avoid roundtrip for mesh reloading by using newest mapping

* tweak hovering (II)

* extract code into gatherInfoForOperation

* fix context menu for skeletons in 3d viewport

* make agglomerate-skeleton-proofreading compatible with magic mappings

* don't run rendering code for context menu when it's not open

* remove unused previousMappingObject

* dynamically switch between local and remotely applied mappings when switching to/from proofreading tool etc. (unfortunately, buggy)

* auto-reload page if dev-proxy fails

* fix missing reload when disabling/re-enabling mapping (now too many reloads are performed)

* extract finishMappingInitialization action

* extract ensureMappingsAreLoadedAndRequestedMappingExists

* rename mappingIsEditable to hasEditableMapping

* introduce BucketRetrievalSource to better cancel/restart bucket watchers and reload the layer when necessary

* introduce clearMappingAction

* make sure that getBucketRetrievalSource doesn't create new instances all the time when multiple volume layers exist

* move cuckoo modules into libs/cuckoo

* use cuckoo hashing for gpu-based mapping instead of binary search (proper uint64 support needs to be tested)
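
The displacement loop at the heart of cuckoo hashing can be sketched as follows. This is a simplified CPU-side illustration with deliberately naive hash functions; the real tables in `libs/cuckoo` additionally mirror their data into a GPU texture and need stronger hashing (cf. the later "fix weak hashing for power-of-two table sizes" commit):

```typescript
// Simplified cuckoo hash table: every key has exactly two candidate slots
// (h1 and h2). Inserting into an occupied slot evicts the occupant, which
// is then re-inserted at *its* other slot, and so on, bounded by maxIters.
class MiniCuckooTable {
  private size: number;
  private keys: Int32Array;
  private values: Int32Array;

  constructor(size: number) {
    this.size = size;
    this.keys = new Int32Array(size).fill(-1); // -1 marks an empty slot
    this.values = new Int32Array(size);
  }

  private h1(k: number): number {
    return k % this.size;
  }

  private h2(k: number): number {
    return Math.floor(k / this.size) % this.size;
  }

  set(key: number, value: number, maxIters: number = 32): boolean {
    let k = key;
    let v = value;
    let slot = this.h1(k);
    for (let i = 0; i < maxIters; i++) {
      if (this.keys[slot] === -1 || this.keys[slot] === k) {
        this.keys[slot] = k;
        this.values[slot] = v;
        return true;
      }
      // Evict the occupant and continue inserting it at its alternative slot.
      const evictedKey = this.keys[slot];
      const evictedValue = this.values[slot];
      this.keys[slot] = k;
      this.values[slot] = v;
      k = evictedKey;
      v = evictedValue;
      slot = slot === this.h1(k) ? this.h2(k) : this.h1(k);
    }
    return false; // a full implementation would rehash here
  }

  get(key: number): number | null {
    if (this.keys[this.h1(key)] === key) return this.values[this.h1(key)];
    if (this.keys[this.h2(key)] === key) return this.values[this.h2(key)];
    return null;
  }
}
```

Lookups touch at most two slots, which is what makes this scheme attractive for a shader: a mapping lookup is two texture fetches instead of a binary search.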

* add missing module

* remove logging

* make use of protobuf shortcut getHasEditableMapping

* Update frontend/javascripts/oxalis/model/sagas/mapping_saga.ts

Co-authored-by: MichaelBuessemeyer <39529669+MichaelBuessemeyer@users.noreply.github.com>

* Update frontend/javascripts/oxalis/model/sagas/proofread_saga.ts

Co-authored-by: MichaelBuessemeyer <39529669+MichaelBuessemeyer@users.noreply.github.com>

* mappingIsEditable -> hasEditableMapping to fix compilation error

* remove unnecessary braces

* incorporate some pr feedback

* more pr feedback

* add comment to cuckoo table 64 bit

* also implement cuckoo table for uint32 keys and values

* fix invalid initialization of mapping texture when mapping is disabled

* fix that mapping was applied twice

* fix uint32 cuckoo implementation (still hardcoded to always use 32 bit)

* use 64 bit look up when necessary in shader

* remove unused mappingSize

* avoid toolbar rerendering

* only write necessary changes to cuckoo table instead of rewriting it from scratch every time

* ensure diffing of previous and new mapping is fast by providing a cached diff operation for which the cache is manually set by the mapping saga
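
The diff between the previous and the new mapping can be pictured like this (a hypothetical simplification with illustrative names):

```typescript
// Sketch: compute which entries differ between the previous and the new
// mapping so that only those entries have to be written to the cuckoo
// texture, instead of rebuilding it from scratch on every change.
function diffMaps<K, V>(
  previousMapping: Map<K, V>,
  newMapping: Map<K, V>,
): { added: K[]; changed: K[]; removed: K[] } {
  const added: K[] = [];
  const changed: K[] = [];
  const removed: K[] = [];
  for (const [key, value] of newMapping) {
    if (!previousMapping.has(key)) {
      added.push(key);
    } else if (previousMapping.get(key) !== value) {
      changed.push(key);
    }
  }
  for (const key of previousMapping.keys()) {
    if (!newMapping.has(key)) {
      removed.push(key);
    }
  }
  return { added, changed, removed };
}
```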

* eagerly compute value set when reasonable to avoid clustering of that computation and to improve FPS
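
"Value set" here refers to the set of distinct segment ids occurring in a bucket's data; a minimal sketch (the actual code operates on the cube's typed bucket arrays):

```typescript
// Collect the distinct segment ids of a bucket. Knowing this set allows
// requesting mapping entries only for ids that are actually present, and
// computing it eagerly spreads the cost over time instead of clustering it.
function computeValueSet(bucketData: Uint32Array): Set<number> {
  const valueSet = new Set<number>();
  for (let i = 0; i < bucketData.length; i++) {
    valueSet.add(bucketData[i]);
  }
  return valueSet;
}
```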

* add missing worker

* implement maintenance of value set for entire data cube to avoid recomputation

* Revert maintenance of value set because it didn't help with performance

This reverts commit c7bb508.

* remove todo because it was tried in c7bb508 and reverted afterwards

* remove some time logging

* optimize/combine some set operations (2x as fast now)

* move and rename diffSets function

* further optimization of set operations (in total 4x faster compared to the initial version)

* remove unused code

* remove superfluous parameter

* add renaming todo

* delete create_set_from_array webworker as it didn't show an improved FPS rate (therefore, the overhead of passing the data doesn't seem reasonable)

* delete debugging code from cuckoo modules

* move attemptMappingLookUp glsl code

* add some comments

* reactivate CI checks

* remove unused code

* remove obsolete imports

* simplify editableMapping check

* fix ts problems

* fix cyclic dependencies

* fix 64 bit mapping rendering

* remove UI warnings that the merger mode doesn't support 64 bit, because now it does

* fix linting

* fix proto related tests

* fix more specs

* fix pullqueue spec

* refactor setNumberLike in cuckoo tables and update some todo comments

* remove unused route, change tracingstore mapping route to proto ListOfLong

* toSet

* remove unused getAgglomerateIdForSegmentId

* fix rendering bug outside of viewport on some GPUs

* adapt agglomeratesForSegments route to new protobuf interface

* link 64-bit issue (#6921) in todo comments

* introduce mappingIsPartial uniform

* mention #7895 in todo comment

* remove commented code

* avoid expensive console.log for large mappings

* disable most ci checks

* fix incorrect hash table size and use hashCombine twice to fix poor capacity utilization due to suboptimal hash distribution

* fix cuckootable rehash (did redundant work) and adapt max iterations parameter

* if many inserts are done for the mapping, flush the table at the end

* fix weak hashing for power-of-two table sizes
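
The power-of-two issue can be demonstrated in isolation (illustrative constants, not the actual hash used by the table):

```typescript
// With a table whose size is a power of two, `hash % size` keeps only the
// low bits of the hash. For a multiplicative hash with an odd multiplier,
// the low bits of the product depend only on the low bits of the key, so
// keys sharing low bits always collide in the same slot.
const MULTIPLIER = 0x9e3779b1; // odd multiplier (illustrative)
const SIZE = 16; // power-of-two table size

const weakSlot = (key: number): number =>
  (Math.imul(key, MULTIPLIER) >>> 0) % SIZE;

// Taking the *high* bits of the product instead mixes in all key bits:
const strongSlot = (key: number): number =>
  (Math.imul(key, MULTIPLIER) >>> 0) >>> 28; // top 4 bits for 16 slots
```

Keys 1 and 17 share their low 4 bits: `weakSlot` maps both to the same slot, while `strongSlot` separates them.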

* refactor diminished hash capacity tweak; clean up and extend tests so that maxing out capacity is tested, too

* clean up reloadData related code in mapping_saga

* fix serializing bigint to protobuf long

* fix toggling of json mappings

* group consecutive actions in action logger middleware; add debug logging for dispatched actions

* fix that forceful disabled tool wasn't re-enabled when possible (e.g., after toggling segmentation opacity)

* mention 64 bit support issue in comment

* fix broken mapping of ids by sorting the input ids for the server

* add comment about sorting

* don't map ids dynamically in segment list view (instead segment items are created with the mapped id if a mapping exists); see #7919 as a follow-up

* test reaching critical capacity and remove todo comment

* remove some todo comments regarding mapId code that might return unmapped ids if the mapping is partial (impact should be low, add comments for it)

* remove more todos and fix toggling of hdf mappings when no volume tracing exists

* cast to number when sorting bigint
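
As a standalone illustration of why the cast is needed: `Array.prototype.sort` requires its comparator to return a number, but subtracting two bigints yields a bigint.

```typescript
// Sorting bigint segment ids. `a - b` is a bigint here, which sort() does
// not accept as a comparator result, so the difference is cast via Number().
// The cast only has to preserve the sign, which Number() does even when the
// difference exceeds Number.MAX_SAFE_INTEGER.
const segmentIds: bigint[] = [10n, 2n, 33n, 7n];
segmentIds.sort((a, b) => Number(a - b));
```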

* use bigint in proofreading_saga where sensible and cast to number as late as possible (e.g., in action creators, in REST API etc)

* disable more ci stuff

* try to handle most ids as Number and cast to Number only when dealing with mapping object and buckets

* show zoom warning for agglomerate files only when the mapping is applied remotely

* fix that supervoxel highlighting of mesh stays active after leaving proofreading tool (fixes #7867)

* fix that after changing the color of a mesh via the segments tab the mesh is always highlighted after initial hover (fixes #7845)

* remove unused imports

* fix proper type of values returned from getAgglomeratesForSegmentsFrom*

* fix incorrect bigint check and refactor to avoid the same mistake in the future

* fix unnecessary type adaption that failed on null

* fix another sorting bug

* integrate pr feedback

* rename cuckoo table to CuckooTableVec3

* unify 64 bit todo comments

* forceful -> forcefully

* refactor eager value set computation

* use 0s to initialize mapping uniforms

* refactor/fix mapId logic for unmapped ids

* fix selective visibility for alpha != 0.2

* only emit soft errors when a data value could not be mapped

* fix mapping message hiding too early/never; fix disabled message in mapping UI

* misc console stuff

* also skip texture updates for cuckoo table when lots of unsets need to be done

* show short user notification when segmentation layer is reloaded

* highlight whole segment mesh on hover even when geometry is not merged (i.e., super-voxels are highlightable) if not in proofreading tool

* use current mag when reading segment ids in proofreading (unless agglomerate skeletons are used)

* remove last todo comments

* update changelog

* remove console.log

* re-enable ci checks

* fix linting

---------

Co-authored-by: Daniel Werner <daniel.werner@scalableminds.com>
Co-authored-by: Charlie Meister <charlie.meister@student.hpi.de>
Co-authored-by: Philipp Otto <philipp.4096@gmail.com>
Co-authored-by: Philipp Otto <philippotto@users.noreply.github.com>
Co-authored-by: MichaelBuessemeyer <39529669+MichaelBuessemeyer@users.noreply.github.com>
6 people authored Jul 31, 2024
1 parent 44007eb commit ca5f2aa
Showing 86 changed files with 2,647 additions and 1,033 deletions.
2 changes: 2 additions & 0 deletions CHANGELOG.unreleased.md
@@ -20,10 +20,12 @@ For upgrade instructions, please check the [migration guide](MIGRATIONS.released
- Upgraded backend dependencies for improved performance and stability. [#7922](https://github.com/scalableminds/webknossos/pull/7922)
- It is now saved whether segment groups are collapsed or expanded, so this information doesn't get lost e.g. upon page reload. [#7928](https://github.com/scalableminds/webknossos/pull/7928/)
- The context menu entry "Focus in Segment List" expands all necessary segment groups in the segments tab to show the highlighted segment. [#7950](https://github.com/scalableminds/webknossos/pull/7950)
- In the proofreading mode, you can now toggle whether only the active segment and the hovered segment are rendered. [#7654](https://github.com/scalableminds/webknossos/pull/7654)

### Changed
- The warning about a mismatch between the scale of a pre-computed mesh and the dataset scale's factor now also considers all supported mags of the active segmentation layer. This reduces the false positive rate regarding this warning. [#7921](https://github.com/scalableminds/webknossos/pull/7921/)
- It is no longer allowed to edit annotations of other organizations, even if they are set to public and to others-may-edit. [#7923](https://github.com/scalableminds/webknossos/pull/7923)
- When proofreading segmentations, the user can now interact with super-voxels directly in the data viewports. Additionally, proofreading is significantly faster because the segmentation data doesn't have to be re-downloaded after each merge/split operation. [#7654](https://github.com/scalableminds/webknossos/pull/7654)

### Fixed
- Fixed a bug that allowed a newly created default bounding box to appear outside the dataset. If the whole bounding box would lie outside the dataset, it is still created regardless. [#7892](https://github.com/scalableminds/webknossos/pull/7892)
2 changes: 1 addition & 1 deletion app/models/annotation/nml/NmlWriter.scala
@@ -263,7 +263,7 @@ class NmlWriter @Inject()(implicit ec: ExecutionContext) extends FoxImplicits {
case Right(volumeTracing) =>
volumeTracing.fallbackLayer.foreach(writer.writeAttribute("fallbackLayer", _))
volumeTracing.largestSegmentId.foreach(id => writer.writeAttribute("largestSegmentId", id.toString))
if (!volumeTracing.mappingIsEditable.getOrElse(false)) {
if (!volumeTracing.hasEditableMapping.getOrElse(false)) {
volumeTracing.mappingName.foreach { mappingName =>
writer.writeAttribute("mappingName", mappingName)
}
113 changes: 87 additions & 26 deletions frontend/javascripts/admin/admin_rest_api.ts
@@ -77,13 +77,19 @@ import type {
MappingType,
VolumeTracing,
UserConfiguration,
Mapping,
NumberLike,
} from "oxalis/store";
import type { NewTask, TaskCreationResponseContainer } from "admin/task/task_create_bulk_view";
import type { QueryObject } from "admin/task/task_search_form";
import { V3 } from "libs/mjs";
import type { Versions } from "oxalis/view/version_view";
import { enforceValidatedDatasetViewConfiguration } from "types/schemas/dataset_view_configuration_defaults";
import { parseProtoTracing } from "oxalis/model/helpers/proto_helpers";
import {
parseProtoListOfLong,
parseProtoTracing,
serializeProtoListOfLong,
} from "oxalis/model/helpers/proto_helpers";
import type { RequestOptions } from "libs/request";
import Request from "libs/request";
import type { Message } from "libs/toast";
@@ -886,7 +892,7 @@ export async function getTracingForAnnotationType(
): Promise<ServerTracing> {
const { tracingId, typ } = annotationLayerDescriptor;
const version = extractVersion(versions, tracingId, typ);
const tracingType = typ.toLowerCase();
const tracingType = typ.toLowerCase() as "skeleton" | "volume";
const possibleVersionString = version != null ? `&version=${version}` : "";
const tracingArrayBuffer = await doWithToken((token) =>
Request.receiveArraybuffer(
@@ -1599,23 +1605,6 @@ export function getEditableMappingInfo(
);
}

export function getAgglomerateIdForSegmentId(
tracingStoreUrl: string,
tracingId: string,
segmentId: number,
): Promise<number> {
return doWithToken(async (token) => {
const urlParams = new URLSearchParams({
token,
segmentId: `${segmentId}`,
});
const { agglomerateId } = await Request.receiveJSON(
`${tracingStoreUrl}/tracings/mapping/${tracingId}/agglomerateIdForSegmentId?${urlParams.toString()}`,
);
return agglomerateId;
});
}

export function getPositionForSegmentInAgglomerate(
datastoreUrl: string,
datasetId: APIDatasetId,
@@ -2068,6 +2057,67 @@ export function getAgglomerateSkeleton(
);
}

export async function getAgglomeratesForSegmentsFromDatastore<T extends number | bigint>(
dataStoreUrl: string,
datasetId: APIDatasetId,
layerName: string,
mappingId: string,
segmentIds: Array<T>,
): Promise<Mapping> {
const segmentIdBuffer = serializeProtoListOfLong<T>(segmentIds);
const listArrayBuffer: ArrayBuffer = await doWithToken((token) =>
Request.receiveArraybuffer(
`${dataStoreUrl}/data/datasets/${datasetId.owningOrganization}/${datasetId.name}/layers/${layerName}/agglomerates/${mappingId}/agglomeratesForSegments?token=${token}`,
{
method: "POST",
body: segmentIdBuffer,
headers: {
"Content-Type": "application/octet-stream",
},
},
),
);
// Ensure that the values are bigint if the keys are bigint
const adaptToType = Utils.isBigInt(segmentIds[0])
? (el: NumberLike) => BigInt(el)
: (el: NumberLike) => el;
const keyValues = _.zip(segmentIds, parseProtoListOfLong(listArrayBuffer).map(adaptToType));
// @ts-ignore
return new Map(keyValues);
}

export async function getAgglomeratesForSegmentsFromTracingstore<T extends number | bigint>(
tracingStoreUrl: string,
tracingId: string,
segmentIds: Array<T>,
): Promise<Mapping> {
const segmentIdBuffer = serializeProtoListOfLong<T>(
// The tracing store expects the ids to be sorted
segmentIds.sort(<T extends NumberLike>(a: T, b: T) => Number(a - b)),
);
const listArrayBuffer: ArrayBuffer = await doWithToken((token) =>
Request.receiveArraybuffer(
`${tracingStoreUrl}/tracings/mapping/${tracingId}/agglomeratesForSegments?token=${token}`,
{
method: "POST",
body: segmentIdBuffer,
headers: {
"Content-Type": "application/octet-stream",
},
},
),
);

// Ensure that the values are bigint if the keys are bigint
const adaptToType = Utils.isBigInt(segmentIds[0])
? (el: NumberLike) => BigInt(el)
: (el: NumberLike) => el;

const keyValues = _.zip(segmentIds, parseProtoListOfLong(listArrayBuffer).map(adaptToType));
// @ts-ignore
return new Map(keyValues);
}

export function getEditableAgglomerateSkeleton(
tracingStoreUrl: string,
tracingId: string,
Expand Down Expand Up @@ -2228,18 +2278,24 @@ export async function getEdgesForAgglomerateMinCut(
tracingStoreUrl: string,
tracingId: string,
segmentsInfo: {
segmentId1: number;
segmentId2: number;
segmentId1: NumberLike;
segmentId2: NumberLike;
mag: Vector3;
agglomerateId: number;
agglomerateId: NumberLike;
editableMappingId: string;
},
): Promise<Array<MinCutTargetEdge>> {
return doWithToken((token) =>
Request.sendJSONReceiveJSON(
`${tracingStoreUrl}/tracings/volume/${tracingId}/agglomerateGraphMinCut?token=${token}`,
{
data: segmentsInfo,
data: {
...segmentsInfo,
// TODO: Proper 64 bit support (#6921)
segmentId1: Number(segmentsInfo.segmentId1),
segmentId2: Number(segmentsInfo.segmentId2),
agglomerateId: Number(segmentsInfo.agglomerateId),
},
},
),
);
@@ -2254,17 +2310,22 @@ export async function getNeighborsForAgglomerateNode(
tracingStoreUrl: string,
tracingId: string,
segmentInfo: {
segmentId: number;
segmentId: NumberLike;
mag: Vector3;
agglomerateId: number;
agglomerateId: NumberLike;
editableMappingId: string;
},
): Promise<NeighborInfo> {
return doWithToken((token) =>
Request.sendJSONReceiveJSON(
`${tracingStoreUrl}/tracings/volume/${tracingId}/agglomerateGraphNeighbors?token=${token}`,
{
data: segmentInfo,
data: {
...segmentInfo,
// TODO: Proper 64 bit support (#6921)
segmentId: Number(segmentInfo.segmentId),
agglomerateId: Number(segmentInfo.agglomerateId),
},
},
),
);
7 changes: 5 additions & 2 deletions frontend/javascripts/libs/async/debounced_abortable_saga.ts
@@ -2,6 +2,9 @@ import { call, type Saga } from "oxalis/model/sagas/effect-generators";
import { buffers, Channel, channel, runSaga } from "redux-saga";
import { delay, race, take } from "redux-saga/effects";

// biome-ignore lint/complexity/noBannedTypes: This is copied from redux-saga because it cannot be imported.
type NotUndefined = {} | null;

/*
* This function takes a saga and a debounce threshold
* and returns a function F that will trigger the given saga
@@ -15,7 +18,7 @@ import { delay, race, take } from "redux-saga/effects";
* is slower than a standard _.debounce. Also see
* debounced_abortable_saga.spec.ts for a small benchmark.
*/
export function createDebouncedAbortableCallable<T, C>(
export function createDebouncedAbortableCallable<T extends NotUndefined, C>(
fn: (param1: T) => Saga<void>,
debounceThreshold: number,
context: C,
@@ -56,7 +59,7 @@ export function createDebouncedAbortableParameterlessCallable<C>(
};
}

function* debouncedAbortableSagaRunner<T, C>(
function* debouncedAbortableSagaRunner<T extends NotUndefined, C>(
debounceThreshold: number,
triggerChannel: Channel<T>,
abortableFn: (param: T) => Saga<void>,
