Collective Knowledge repository for collaboratively benchmarking and optimising the embedded deep vision runtime library for the Jetson TX1

All CK components can be found at cKnowledge.io and in one GitHub repository!

This project is hosted by the cTuning foundation.


Introduction

CK-TensorRT is an open framework for collaborative and reproducible optimisation of convolutional neural networks on the Jetson TX1, built on top of the Collective Knowledge framework. It is based on the Deep Inference framework from Dustin Franklin (a Jetson developer at NVIDIA). In essence, CK-TensorRT is simply a suite of convenient wrappers with a unified JSON API for customisable building, evaluation and multi-objective optimisation of the Jetson Inference runtime library for the Jetson TX1.
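
For illustration, once everything is installed as described below, the same actions that the CK command line exposes can also be driven programmatically through this JSON API. The one-liner below is only a sketch, assuming the standard ck.kernel.access() entry point of the CK framework and the tensorrt-test program from this repository:

$ python -c "import ck.kernel as ck; print(ck.access({'action':'run', 'module_uoa':'program', 'data_uoa':'tensorrt-test'}))"

By CK convention, the returned dictionary contains a 'return' key that is 0 on success; on failure it should also contain an 'error' key with a human-readable message.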

Authors/contributors

Quick installation on Ubuntu

TBD

Installing general dependencies

$ sudo apt install coreutils \
                   build-essential \
                   make \
                   cmake \
                   wget \
                   git \
                   python \
                   python-pip

Installing CK-TensorRT dependencies

$ sudo apt install libqt4-dev \
                   libglew-dev \
                   libgstreamer1.0-dev

Installing CK

$ sudo pip install ck
$ ck version
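
If you prefer not to install CK system-wide with sudo, a per-user installation is a reasonable alternative; this sketch assumes that pip places the ck script under ~/.local/bin, which you then add to your PATH:

$ pip install ck --user
$ export PATH=$HOME/.local/bin:$PATH
$ ck version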

Installing CK-TensorRT repository

$ ck pull repo:ck-tensorrt
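
To check what the freshly pulled repository provides, you can list its CK entries. The wildcard form below is an assumption based on standard CK usage and may need adjusting for your CK version:

$ ck list repo
$ ck list program:*tensorrt*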

Building CK-TensorRT and all dependencies via CK

The first time you run a TensorRT program (e.g. tensorrt-test), CK will build and install all missing dependencies on your machine, download the required data sets and start the benchmark:

$ ck run program:tensorrt-test
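
If you want more control over the process, you can (optionally) build the program explicitly before running it. The commands below are a sketch using generic CK program actions and may differ slightly across CK versions:

$ ck compile program:tensorrt-test
$ ck run program:tensorrt-test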

Related projects and initiatives

We are working with the community to unify and crowdsource performance analysis and tuning of various DNN frameworks (or any realistic workload) using the Collective Knowledge Technology: