Commit

Update documentation (#62)
ClementAlba authored Feb 3, 2023
1 parent 13d4723 commit 836c7ee
Showing 5 changed files with 94 additions and 51 deletions.
5 changes: 0 additions & 5 deletions .readthedocs.yaml
@@ -22,8 +22,3 @@ sphinx:
# If using Sphinx, optionally build your docs in additional formats such as PDF
# formats:
# - pdf

# Optionally declare the Python requirements required to build your docs
python:
install:
- requirements: requirements.txt
10 changes: 5 additions & 5 deletions docs/source/index.rst → docs/index.rst
@@ -6,8 +6,8 @@ Some processing on point clouds can be very time consuming, this problem can be
.. toctree::
:maxdepth: 2

intro
installation
usage
api
cli
source/intro
source/installation
source/usage
source/api
source/cli
71 changes: 56 additions & 15 deletions docs/source/api.rst
@@ -18,25 +18,66 @@ There is an example of the usage of the API:
There is only one function in pdal-parallelizer:

.. code-block:: python

   process(config, input_type, timeout, n_workers, threads_per_worker, dry_run, diagnostic, tile_size, buffer, remove_buffer, bounding_box, process)

Process point clouds.

Parameters:
...........

**config (*str*)**

Path of your config file.

**input_type (*str*)**

This parameter indicates whether you are processing a single file or a list of files. It can take only two values: "single" or "dir". If "single", change the input field of the config file to the path of your file instead of the path of your input directory.

**timeout (*int*, *optional*)**

Time before a worker is killed for inactivity. If you do not specify a timeout, you will be asked to enter one on the command line before the run starts.

**n_workers (*int*, *optional*)**

Number of cores you want for processing. (default=3)

**threads_per_worker (*int*, *optional*)**

Number of threads for each worker. (default=1)

**dry_run (*int*, *optional*)**

Number of files on which to execute a test run.

**diagnostic (*bool*, *optional*)**

Get a graph of the memory usage during the execution. (default=False)

Parameters related to single file processing:
.............................................

**tile_size (*tuple*, *optional*)**

Size of the tiles. (default=(256,256))

**buffer (*int*, *optional*)**

Size of the buffer that will be applied to the tiles. (in all 4 directions)

**remove_buffer (*bool*, *optional*)**

If True, the buffer is removed when your tiles are written. If you choose not to delete the buffer, it will be assigned the withheld flag. (default=False)

**bounding_box (*tuple*, *optional*)**

Coordinates of the bounding box you want to process. (minx, miny, maxx, maxy)

**merge_tiles (*bool*, *optional*)**

Indicate you want to merge all the tiles at the end of a single cloud treatment. (default=False)

**remove_tiles (*bool*, *optional*)**

If you choose to merge the tiles, set remove_tiles to True to remove the merged tiles. (default=False)

**process (*bool*, *optional*)**

If True, you will see a progress bar while the clouds are processed. (default=False)
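To make the tiling parameters concrete (tile_size, buffer and bounding_box), here is a small self-contained sketch. It is not pdal-parallelizer code; the function name and logic are illustrative only, showing one plausible way a bounding box could be split into buffered tiles:

```python
# Illustrative helper (not part of pdal-parallelizer): split a bounding
# box into tiles of a given size, each expanded by a buffer on all
# 4 directions and clamped to the original bounding box.
def tile_bounds(bounding_box, tile_size=(256, 256), buffer=0):
    minx, miny, maxx, maxy = bounding_box
    tw, th = tile_size
    tiles = []
    y = miny
    while y < maxy:
        x = minx
        while x < maxx:
            # Apply the buffer on every side, then clamp to the box.
            tiles.append((max(minx, x - buffer), max(miny, y - buffer),
                          min(maxx, x + tw + buffer), min(maxy, y + th + buffer)))
            x += tw
        y += th
    return tiles

tiles = tile_bounds((0, 0, 512, 512), tile_size=(256, 256), buffer=10)
print(len(tiles))  # a 512x512 box with 256x256 tiles yields 4 tiles
```

Each tile overlaps its neighbours by the buffer width, which is what allows the buffer to be stripped (or flagged as withheld) after processing.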
57 changes: 32 additions & 25 deletions docs/source/cli.rst
@@ -20,53 +20,60 @@ Here, you just have to enter the desired timeout and press enter, then the treatment
pdal-parallelizer process-pipelines CONFIG INPUT_TYPE [OPTIONS]
Required arguments
..................

**-c, --config**

Path of your config file.

**-it, --input_type**

This option indicates whether you are processing a single file or a list of files: single or dir. If single, change the input field of the config file to the path of your file instead of the path of your input directory.

Options
.......

**-nw, --n_workers**

Number of cores you want for processing (default=3)

**-tpw, --threads_per_worker**

Number of threads for each worker (default=1)

**-dr, --dry_run**

Number of files on which to execute a test run (default=None)

**-d, --diagnostic**

Get a graph of the memory usage during the execution

Options related to single file processing
.........................................

**-ts, --tile_size**

Size of the tiles (default=(256,256))

**-b, --buffer**

Size of the buffer that will be applied to the tiles (in all 4 directions)

**-rb, --remove_buffer**

This flag indicates you want to remove the buffer when your tiles are written. If you choose not to delete the buffer, it will be assigned the withheld flag.

**-bb, --bounding_box**

Coordinates of the bounding box you want to process (minx miny maxx maxy)

**-mt, --merge_tiles**

This flag indicates you want to merge all the tiles at the end of a single cloud treatment

**-rt, --remove_tiles**

If you choose to merge the tiles, set this flag to remove the merged tiles.
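A hypothetical invocation combining some of these options could look like the following. The config path and worker counts are placeholders, and the exact value syntax for each option may differ; check ``pdal-parallelizer process-pipelines --help`` for the authoritative forms:

.. code-block:: console

   pdal-parallelizer process-pipelines -c config.json -it dir -nw 4 -tpw 2 -d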
2 changes: 1 addition & 1 deletion docs/source/usage.rst
@@ -10,7 +10,7 @@ This file looks like this:
{
    "input": "The folder that contains your input files (or a file path)",
    "output": "The folder that will receive your output files",
    "temp": "The folder that will contain your serialized pipelines",
    "pipeline": "Your pipeline path"
}
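The single-line change in this file adds the comma that was missing after the "temp" entry. Without that comma the config is not valid JSON and any JSON parser rejects it, as a minimal standalone check with Python's json module shows (the folder names here are placeholders, not pdal-parallelizer defaults):

```python
import json

# The corrected config, with the comma after the "temp" entry, parses fine.
valid = '''{
    "input": "in_folder",
    "output": "out_folder",
    "temp": "temp_folder",
    "pipeline": "pipeline.json"
}'''
config = json.loads(valid)
print(sorted(config))  # ['input', 'output', 'pipeline', 'temp']

# Drop that comma and json.loads raises a JSONDecodeError.
invalid = valid.replace('"temp_folder",', '"temp_folder"')
try:
    json.loads(invalid)
except json.JSONDecodeError as e:
    print("invalid config:", e.msg)
```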
