Official PyTorch implementation of PSUMNet for skeleton action recognition. Accepted at ECCV 2022 WCPA.
PSUMNet introduces a unified, part-based streaming approach, in contrast to the conventional modality-wise streaming used by prior methods. This allows PSUMNet to achieve state-of-the-art performance across skeleton action recognition datasets against competing methods that use around 100-400% more parameters.
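To make the part-based streaming idea concrete, here is a minimal Python sketch of how a skeleton sequence can be sliced into body/hands/legs joint groups before each group is fed to its own stream. The joint index lists and function name below are illustrative placeholders, not the exact groupings used by PSUMNet:

```python
import numpy as np

# Illustrative joint groups for a 25-joint skeleton (e.g., the NTU RGB+D
# Kinect layout). These index lists are placeholders to show the slicing
# mechanics only; the actual groupings used by PSUMNet live in this repo.
PART_GROUPS = {
    "body": [0, 1, 2, 3, 4, 8, 12, 16, 20],                # trunk + limb roots
    "hands": [4, 5, 6, 7, 8, 9, 10, 11, 21, 22, 23, 24],   # arms + fingers
    "legs": [0, 12, 13, 14, 15, 16, 17, 18, 19],           # hips + legs
}

def split_into_part_streams(x: np.ndarray) -> dict:
    """Slice a skeleton sequence of shape (C, T, V, M) into per-part tensors.

    C: channels (x, y, z), T: frames, V: joints, M: persons.
    """
    return {part: x[:, :, idx, :] for part, idx in PART_GROUPS.items()}

# Example: a random clip with 3 channels, 64 frames, 25 joints, 2 persons.
clip = np.random.randn(3, 64, 25, 2)
for name, tensor in split_into_part_streams(clip).items():
    print(name, tensor.shape)  # e.g., body -> (3, 64, 9, 2)
```

Because each stream operates only on its own subset of joints, the per-stream networks can be much smaller than full-skeleton modality streams, which is where the parameter savings come from.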
The following table compares the performance of PSUMNet with existing methods on the Cross-Subject splits of the NTU60, NTU120, NTU60-X, and NTU120-X datasets.
Model | # Params (M) | FLOPs (G) | NTU60 | NTU120 | NTU60-X | NTU120-X |
---|---|---|---|---|---|---|
PA-ResGCN | 3.6 | 18.5 | 90.9 | 87.3 | 91.6 | 86.4 |
MS-G3D | 6.4 | 48.5 | 91.5 | 86.9 | 91.8 | 87.1 |
4s ShiftGCN | 2.8 | 10.0 | 90.7 | 85.9 | 91.8 | 86.2 |
DSTA-Net | 14.0 | 64.7 | 91.5 | 86.6 | 93.6 | 87.8 |
CTR-GCN | 5.6 | 7.6 | 92.4 | 88.9 | 93.9 | 88.3 |
PSUMNet | 2.8 | 2.7 | 92.9 | 89.4 | 94.7 | 89.1 |
Results on the SHREC3d hand gesture dataset:
Model | # Params (M) | 14 Gestures | 28 Gestures |
---|---|---|---|
Key-Frame CNN | 7.9 | 82.9 | 71.9 |
CNN+LSTM | 8.0 | 89.8 | 86.3 |
Parallel CNN | 13.8 | 91.3 | 84.4 |
STA-Res TCN | 6.0 | 93.6 | 90.7 |
DDNet | 1.8 | 94.6 | 91.9 |
DSTANet | 14.0 | 97.0 | 93.9 |
PSUMNet | 0.9 | 95.5 | 93.1 |
Major requirements are as follows:
- Python >= 3.6
- PyTorch >= 1.1.0
The remaining requirements are listed in `requirements.txt` and can be installed with `pip install -r requirements.txt`.
We trained our model using four 1080Ti GPUs with 12 GB of memory each.
We report results on the following datasets. The specific steps for processing each dataset are explained in the next sections.
- Request the dataset here: https://rose1.ntu.edu.sg/dataset/actionRecognition
- Download the skeleton-only datasets:
  - `nturgbd_skeletons_s001_to_s017.zip` (NTU RGB+D 60)
  - `nturgbd_skeletons_s018_to_s032.zip` (NTU RGB+D 120)
- Extract the above files to `./data/nturgbd_raw`
We use the same data preprocessing for the NTU60 and NTU120 datasets as given by CTR-GCN. The following steps generate the NTU data from the raw skeleton files.
- Generate the NTU RGB+D 60 or NTU RGB+D 120 dataset:

```
cd ./data/ntu # or cd ./data/ntu120

# Get the skeleton of each performer
python get_raw_skes_data.py

# Remove the bad skeletons
python get_raw_denoised_data.py

# Transform the skeletons to the center of the first frame
python seq_transformation.py
```
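As a quick sanity check after preprocessing, you can inspect the generated arrays. The snippet below assumes the CTR-GCN-style `.npz` output with `x_train`/`y_train` keys; adjust the path and key names to whatever `seq_transformation.py` actually produced on your machine.

```python
import numpy as np

# Hypothetical path; point this at the .npz produced by seq_transformation.py.
data = np.load("./data/ntu/NTU60_CS.npz")
print(data.files)                                     # list the stored arrays
print(data["x_train"].shape, data["y_train"].shape)   # samples and labels
```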
These extended versions of the original NTU RGB+D datasets are introduced here. Once the datasets are downloaded, specify their paths in the corresponding config files under `./config/ntux_configs/`. No specific preprocessing is needed to train PSUMNet on NTU60-X and NTU120-X.
- Download the SHREC data from here
- Generate the train/test splits with:

```
python data/shrec/gendata.py
```
This preprocessing is adopted from DSTA-Net. We also provide already-preprocessed data, which can be found at `./data/shrec/preprocessed_shrec`.
`./config/` contains the configuration files for all the part-based streams (i.e., `body.yaml`, `hand.yaml`, `leg.yaml`) for the NTU Kinect and NTU-X datasets.
Once the data and the config files are set, model training can be started with the following command:

```
python main.py --config <path of the config file>
```
To run inference with pretrained weights, use the following command:

```
python main.py --config <path of config file> --phase test --weights <path of pretrained weights>
```
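For example, with a hypothetical body-stream config and checkpoint path (substitute the files you actually use):

```
python main.py --config ./config/body.yaml --phase test --weights ./weights/body_ntu60_xsub.pt
```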
Once the model is trained for all the part streams and their scores are saved in `.pkl` files, you can use `ensemble.py` to compute the final accuracy by specifying the paths of these score files.
To reproduce the numbers reported in the paper, use the stream-wise pickle files from here. The scores of each stream are weighted by a hyperparameter alpha, which we chose experimentally. The alpha values corresponding to the different dataset splits are listed in the comments of `ensemble.py`. After adding the paths to the score pickle files and setting alpha, run the following command to compute the final accuracy:
```
python ensemble.py
```
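For reference, the following minimal sketch shows the weighted-fusion idea that `ensemble.py` implements. The file names, pickle layout, and alpha values here are assumptions for illustration; consult `ensemble.py` for the exact logic and the experimentally chosen weights.

```python
import pickle

import numpy as np

# Hypothetical score files, one per part stream, plus hypothetical
# per-stream weights (the real alphas are documented in ensemble.py).
stream_scores = ["body_score.pkl", "hand_score.pkl", "leg_score.pkl"]
alphas = [1.0, 0.7, 0.3]

# Hypothetical ground-truth file holding (sample_name, label) pairs.
with open("labels.pkl", "rb") as f:
    labels = pickle.load(f)

# Accumulate alpha-weighted class scores per sample across the streams.
fused = {}
for path, alpha in zip(stream_scores, alphas):
    with open(path, "rb") as f:
        scores = dict(pickle.load(f))  # sample_name -> class-score vector
    for name, vec in scores.items():
        fused[name] = fused.get(name, 0.0) + alpha * np.asarray(vec)

# The final prediction for each sample is the argmax of the fused scores.
correct = sum(int(np.argmax(fused[name]) == int(label)) for name, label in labels)
print(f"Ensemble accuracy: {correct / len(labels):.4f}")
```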
Pretrained weights and score files for the body stream on the NTU60/120 Cross-Subject split can be found here. Score files for all three streams for NTU60/120 and NTU60-X/120-X can also be found at the same link.
The provided score files, along with the alpha values specified in `./ensemble.py`, can be used to reproduce the results reported in the paper.
This work is inspired by CTR-GCN. We thank the authors of that repo for their valuable contribution.