Commit: README update (#8)

* update readme

* fix import statement

Co-authored-by: milos-colic <75583254+milos-colic@users.noreply.github.com>
nkvuong and milos-colic authored Jun 13, 2022
1 parent 561d03d commit b968b14
Showing 1 changed file: README.md (12 additions, 8 deletions)

@@ -4,8 +4,8 @@
___

[![DBR](https://img.shields.io/badge/DBR-10.4_ML-green)]()
-[![PyTest](https://github.com/databricks/arcuate/actions/workflows/pytest.yml/badge.svg?branch=main)](https://github.com/databricks/arcuate/actions/workflows/pytest.yml)
-[![Build arcuate project](https://github.com/databricks/arcuate/actions/workflows/build.yml/badge.svg?branch=main)](https://github.com/databricks/arcuate/actions/workflows/build.yml)
+[![PyTest](https://github.com/databrickslabs/arcuate/actions/workflows/pytest.yml/badge.svg?branch=main)](https://github.com/databrickslabs/arcuate/actions/workflows/pytest.yml)
+[![Build arcuate project](https://github.com/databrickslabs/arcuate/actions/workflows/build.yml/badge.svg?branch=main)](https://github.com/databrickslabs/arcuate/actions/workflows/build.yml)

## Model exchange via Delta Sharing

@@ -31,26 +31,30 @@ The project name takes inspiration from arcuate delta - the wide fan-shaped rive
- Train model in Databricks (or elsewhere), store it in MLflow
- Export MLflow experiments & models to a Delta table and add it to a share, using Python APIs
```python
-import arcuate
+from arcuate import *
+client = MlflowClient()
+spark = SparkSession.builder.getOrCreate()

# export the experiment experiment_name to table_name, and add it to share_name
-export_experiments(experiment_name, table_name, share_name)
+provider.export_experiments(client, spark, experiment_name, table_name, share_name)

# export the model model_name to table_name, and add it to share_name
-export_models(model_name, table_name, share_name)
+provider.export_models(client, spark, model_name, table_name, share_name)
```

- Recipient of this shared table can load it into MLflow seamlessly:
```python
-import arcuate
+from arcuate import *
import delta_sharing

+client = MlflowClient()
+spark = SparkSession.builder.getOrCreate()
df = delta_sharing.load_as_pandas(delta_sharing_coordinate)

# import the shared table as experiment_name
-import_experiments(df, experiment_name)
+recipient.import_experiments(client, df, experiment_name)
# or import the model_name
-import_models(df, model_name)
+recipient.import_models(client, df, model_name)
```

## Project support

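For context, here is a minimal end-to-end sketch of the updated API surface shown in this diff. The explicit `mlflow`/`pyspark` imports, the Delta Sharing profile path, and all placeholder names (`my_experiment`, `experiments_share`, etc.) are illustrative assumptions, not part of the commit.

```python
# Provider side: export an MLflow experiment and a registered model to Delta
# tables and add them to a Delta Sharing share, using the new call signatures.
from arcuate import *                      # per the updated README
from mlflow.tracking import MlflowClient   # assumed explicit import
from pyspark.sql import SparkSession       # assumed explicit import

client = MlflowClient()
spark = SparkSession.builder.getOrCreate()

provider.export_experiments(client, spark, "my_experiment", "default.experiments_table", "experiments_share")
provider.export_models(client, spark, "my_model", "default.models_table", "models_share")

# Recipient side: load the shared table with delta-sharing, then import it into MLflow.
import delta_sharing

df = delta_sharing.load_as_pandas("config.share#experiments_share.default.experiments_table")
recipient.import_experiments(client, df, "imported_experiment")
```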