From 6865de2fe2f8e59dacb6d37a52da4b4d07fb8049 Mon Sep 17 00:00:00 2001 From: NANAE AUBRY Date: Wed, 23 Oct 2024 12:14:10 +0200 Subject: [PATCH 1/3] use folders add instead content --- .../updating_features_in_a_feature_layer.ipynb | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/samples/05_content_publishers/updating_features_in_a_feature_layer.ipynb b/samples/05_content_publishers/updating_features_in_a_feature_layer.ipynb index 62828eeebb..4c38e0617a 100644 --- a/samples/05_content_publishers/updating_features_in_a_feature_layer.ipynb +++ b/samples/05_content_publishers/updating_features_in_a_feature_layer.ipynb @@ -1 +1 @@ -{"cells": [{"cell_type": "markdown", "metadata": {}, "source": ["# Updating features in a feature layer\n", "\n", "As content publishers, you may be required to keep certain web layers up to date. As new data arrives, you may have to append new features, update existing features etc. There are a couple of different options to accomplish this:\n", " \n", " - Method 1: editing individual features as updated datasets are available\n", " - Method 2: overwriting feature layers altogether with updated datasets\n", " \n", "Depending on the number of features that are updated, your workflow requirements, you may adopt either or both kinds of update mechanisms.\n", "\n", "In this sample, we explore the first method. For **Method 2**, refer to the sample titled [Overwriting feature layers](https://developers.arcgis.com/python/sample-notebooks/overwriting-feature-layers)\n", "\n", "**Note**: To run this sample, you need the ``pandas`` library in your conda environment. If you don't have the library, install it by running the following command from cmd.exe or your shell\n", "```\n", "conda install pandas```"]}, {"cell_type": "markdown", "metadata": {"toc": true}, "source": ["

Table of Contents\n", "
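Every edit in this sample ultimately flows through `FeatureLayer.edit_features()`. As a quick orientation, here is a minimal sketch of that call; the `layer`, `new_feature` and `changed_feature` names are placeholders, not objects defined in this notebook:

```python
# minimal sketch of the editing API used throughout this sample:
# `adds` and `updates` take lists of Feature objects; `deletes` takes object IDs
result = layer.edit_features(adds=[new_feature],
                             updates=[changed_feature],
                             deletes="12,13")
print(result['updateResults'])  # per-feature success flags
```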
"]}, {"cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [], "source": ["# Connect to the GIS\n", "from arcgis.gis import GIS\n", "from arcgis import features\n", "import pandas as pd"]}, {"cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["Enter password: \u00b7\u00b7\u00b7\u00b7\u00b7\u00b7\u00b7\u00b7\n"]}], "source": ["#Access the portal using \"amazing_arcgis_123\" as password for the given Username. \n", "gis = GIS(\"https://pythonapi.playground.esri.com/portal\")"]}, {"cell_type": "markdown", "metadata": {}, "source": ["## Updating feature layer by editing individual features\n", "\n", "Let us consider a scenario where we need to update a feature layer containing the capital cities of the US. We have 3 csv datasets simulating an update workflow as described below:\n", "\n", " 1. capitals_1.csv -- contains the initial, incomplete dataset\n", " 2. capitals_2.csv -- contains additional points and updates to existing points, building on top of capitals_1.csv\n", " 3. capitals_annex.csv -- an alternate table containing additional attribute information\n", " \n", "Our goal is to update the feature layer with each of these datasets doing the necessary edit operations."]}, {"cell_type": "markdown", "metadata": {}, "source": ["### Publish the cities feature layer using the initial dataset"]}, {"cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [{"data": {"text/html": ["
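As an aside, because the csv already carries `longitude`/`latitude` columns, the same data can also be loaded as a spatially enabled DataFrame. A hedged sketch using the `GeoAccessor` that ships with the API:

```python
from arcgis.features import GeoAccessor  # registers the DataFrame.spatial accessor

# build a spatially enabled DataFrame straight from the coordinate columns
sdf = pd.DataFrame.spatial.from_xy(cities_df_1, x_column='longitude', y_column='latitude')
sdf.spatial.full_extent  # quick sanity check that geometries were created
```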
"], "text/plain": [" city_id name state capital pop2000 pop2007 longitude latitude\n", "0 1 Honolulu HI State 371657 378587 -157.823436 21.305782\n", "1 2 Juneau AK State 30711 31592 -134.511582 58.351418\n", "2 3 Boise City ID State 185787 203529 -116.237655 43.613736\n", "3 4 Olympia WA State 27514 45523 -122.893073 47.042418\n", "4 5 Salem OR State 136924 152039 -123.029155 44.931109"]}, "execution_count": 5, "metadata": {}, "output_type": "execute_result"}], "source": ["# read the initial csv\n", "csv1 = 'data/updating_gis_content/capitals_1.csv'\n", "cities_df_1 = pd.read_csv(csv1)\n", "cities_df_1.head()"]}, {"cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [{"data": {"text/plain": ["(19, 8)"]}, "execution_count": 6, "metadata": {}, "output_type": "execute_result"}], "source": ["# print the number of records in this csv\n", "cities_df_1.shape"]}, {"cell_type": "markdown", "metadata": {}, "source": ["As you can see, this dataset only contains 19 rows or 19 capital cities. It is not the complete dataset."]}, {"cell_type": "markdown", "metadata": {}, "source": ["Let's add this `csv` as a portal item. Adding the item creates a CSV item and uploads the original file to the portal, establishing a link between the item and the original file name. Therefore, we need a unique name for the file to guarantee it does not collide with any file of the same name that may have been uploaded by the same user. We'll use standard library modules to copy the file and give it a new name so we can add it to the portal"]}, {"cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [], "source": ["import os\n", "import datetime as dt\n", "import shutil\n", "\n", "# assign variables to locations on the file system \n", "cwd = os.path.abspath(os.getcwd())\n", "data_pth = os.path.join(cwd, r'data/updating_gis_content/')\n", "\n", "# create a unique timestamp string to append to the file name\n", "now_ts = str(int(dt.datetime.now().timestamp()))\n", "\n", "# copy the file, appending the unique string and assign it to a variable\n", "my_csv = shutil.copyfile(os.path.abspath(csv1),\n", " os.path.join(data_pth, 'capitals_1_' + now_ts + '.csv'))"]}, {"cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [{"data": {"text/plain": ["'\\\\path\\\\on\\\\computer\\\\05_content_publishers\\\\data/updating_gis_content/capitals_1_1548875507.csv'"]}, "execution_count": 8, "metadata": {}, "output_type": "execute_result"}], "source": ["my_csv"]}, {"cell_type": "code", "execution_count": 9, "metadata": {}, "outputs": [{"data": {"text/html": ["
[Item card: "USA Capitals spreadsheet 1548875507" (CSV by arcgis_python). Last Modified: January 30, 2019. 0 comments, 0 views]
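Note that `gis.content.add()` is deprecated in recent releases of the Python API in favor of folder-based adds, which is exactly the migration this patch makes. A sketch of the equivalent call, assuming `arcgis` 2.3 or later:

```python
# folder-based equivalent of gis.content.add()
root_folder = gis.content.folders.get()   # the signed-in user's root folder
job = root_folder.add(item_properties=item_prop, file=my_csv)
csv_item = job.result()                   # Folder.add() is asynchronous; wait for the Item
```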
\n", " "], "text/plain": [""]}, "execution_count": 9, "metadata": {}, "output_type": "execute_result"}], "source": ["# add the initial csv file and publish that as a web layer\n", "item_prop = {'title':'USA Capitals spreadsheet ' + now_ts}\n", "csv_item = gis.content.add(item_properties=item_prop, data=my_csv)\n", "csv_item"]}, {"cell_type": "markdown", "metadata": {}, "source": ["This spreadsheet has co-ordinates as `latitude` and `longitude` columns which will be used for geometries during publishing."]}, {"cell_type": "code", "execution_count": 10, "metadata": {}, "outputs": [{"data": {"text/html": ["
[Item card: "USA Capitals spreadsheet 1548875507" (Feature Layer Collection by arcgis_python). Last Modified: January 30, 2019. 0 comments, 0 views]
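Here `publish()` auto-detects the coordinate columns. If the column names were less conventional, they could be spelled out via publish parameters; a hedged sketch:

```python
# optional: name the coordinate fields explicitly instead of relying on auto-detection
publish_params = {'type': 'csv',
                  'name': 'USA_Capitals',
                  'locationType': 'coordinates',
                  'latitudeFieldName': 'latitude',
                  'longitudeFieldName': 'longitude'}
cities_item = csv_item.publish(publish_parameters=publish_params)
```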
\n", " "], "text/plain": [""]}, "execution_count": 10, "metadata": {}, "output_type": "execute_result"}], "source": ["# publish the csv item into a feature layer\n", "cities_item = csv_item.publish()\n", "cities_item"]}, {"cell_type": "code", "execution_count": 19, "metadata": {}, "outputs": [{"data": {"text/html": ["
[Item card: "USA Capitals" (Feature Layer Collection by arcgis_python). Last Modified: January 30, 2019. 0 comments, 0 views]
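`update()` accepts any writable item property, so a search snippet and tags could be set in the same call; for example:

```python
# set additional discoverability metadata on the published item
cities_item.update(item_properties={'snippet': 'Capital cities of the USA with population figures',
                                    'tags': 'capitals, USA, python'})
```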
\n", " "], "text/plain": [""]}, "execution_count": 19, "metadata": {}, "output_type": "execute_result"}], "source": ["# update the item metadata\n", "item_prop = {'title':'USA Capitals'}\n", "cities_item.update(item_properties = item_prop, thumbnail='data/updating_gis_content/capital_cities.png')\n", "cities_item"]}, {"cell_type": "markdown", "metadata": {}, "source": ["### Apply updates from the second spreadsheet\n", "The next set of updates have arrived and are stored in `capitals_2.csv`. We are told it contains corrections for the original set in addition to new features. We need to figure out which rows have changed, apply `update` operation on those, then apply `add` operation to new rows.\n", "\n", "To start with, let us read the second csv file. Note, in this sample, data is stored in csv. In reality, it could be from your enterprise database or any other data source."]}, {"cell_type": "code", "execution_count": 11, "metadata": {}, "outputs": [{"data": {"text/html": ["
"], "text/plain": [" city_id name state capital pop2000 pop2007 longitude latitude\n", "0 20 Baton Rouge LA State 227818 228810 -91.140227 30.458091\n", "1 21 Helena MT State 25780 26007 -112.027027 46.595809\n", "2 22 Bismarck ND State 55532 59344 -100.779000 46.813346\n", "3 23 Pierre SD State 13876 14169 -100.336382 44.367964\n", "4 24 St. Paul MN State 287151 291643 -93.114118 44.954364"]}, "execution_count": 11, "metadata": {}, "output_type": "execute_result"}], "source": ["# read the second csv set\n", "csv2 = 'data/updating_gis_content/capitals_2.csv'\n", "cities_df_2 = pd.read_csv(csv2)\n", "cities_df_2.head()"]}, {"cell_type": "code", "execution_count": 12, "metadata": {}, "outputs": [{"data": {"text/plain": ["(36, 8)"]}, "execution_count": 12, "metadata": {}, "output_type": "execute_result"}], "source": ["# get the dimensions of this csv\n", "cities_df_2.shape"]}, {"cell_type": "markdown", "metadata": {}, "source": ["#### Identifying existing features that need to be updated\n", "\n", "To identify features that need to be updated, let us read the attribute table of the published feature layer and compare that against the second csv. To read the attribute table, we perform a `query()` on the feature layer which returns us an `arcgis.feature.FeatureSet` object. Refer to the guide pages on [accessing features from feature layers](https://developers.arcgis.com/python/guide/working-with-feature-layers-and-features/) to learn more about this.\n", "\n", "Note, at this point, we could work with the `cities_df_1` dataframe we created from the original csv file. However, in practice you may not always have the original dataset or your feature layer might have undergone edits after it was published. Hence, we query the feature layer directly."]}, {"cell_type": "code", "execution_count": 13, "metadata": {}, "outputs": [{"data": {"text/html": ["
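`query()` can also filter server-side when you do not need the whole table; a couple of hedged examples:

```python
# only the features matching a SQL-style where clause, and only two fields
big_cities = cities_flayer.query(where="pop2007 > 500000", out_fields="name,pop2007")

# just the feature count, skipping geometries and attributes entirely
cities_flayer.query(return_count_only=True)
```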
"], "text/plain": [" SHAPE capital city_id \\\n", "0 {\"x\": -17568824.553, \"y\": 2428377.352700006, \"... State 1 \n", "1 {\"x\": -14973760.769500002, \"y\": 8041504.674200... State 2 \n", "2 {\"x\": -12939516.521100001, \"y\": 5405860.248099... State 3 \n", "3 {\"x\": -13680394.263900002, \"y\": 5949000.547900... State 4 \n", "4 {\"x\": -13695542.842799995, \"y\": 5610682.544100... State 5 \n", "\n", " latitude longitude name objectid pop2000 pop2007 state \n", "0 21.305782 -157.823436 Honolulu 1 371657 378587 HI \n", "1 58.351418 -134.511582 Juneau 2 30711 31592 AK \n", "2 43.613736 -116.237655 Boise City 3 185787 203529 ID \n", "3 47.042418 -122.893073 Olympia 4 27514 45523 WA \n", "4 44.931109 -123.029155 Salem 5 136924 152039 OR "]}, "execution_count": 13, "metadata": {}, "output_type": "execute_result"}], "source": ["cities_flayer = cities_item.layers[0]\n", "cities_fset = cities_flayer.query() #querying without any conditions returns all the features\n", "cities_fset.sdf.head()"]}, {"cell_type": "markdown", "metadata": {}, "source": ["The `city_id` column is common between both the datasets. Next, let us perform an `inner` join with the table from feature layer as left and updated csv as right. Inner joins will yield those rows that are present in both tables. Learn more about [inner joins here](https://www.w3schools.com/sql/sql_join_inner.asp)."]}, {"cell_type": "code", "execution_count": 14, "metadata": {}, "outputs": [{"data": {"text/html": ["
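If you needed to update only the attributes that actually changed, instead of overwriting every column, the `_x` (feature layer) and `_y` (csv) column pairs produced by the merge can be compared directly. A sketch; note that exact float comparison may flag harmless rounding differences:

```python
# report which attribute columns differ between the layer (_x) and the csv (_y)
for col in ['name', 'state', 'capital', 'pop2000', 'pop2007', 'longitude', 'latitude']:
    changed = overlap_rows[overlap_rows[col + '_x'] != overlap_rows[col + '_y']]
    if not changed.empty:
        print(col, '-> city_ids:', list(changed['city_id']))
```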
"], "text/plain": [" SHAPE capital_x city_id \\\n", "0 {'x': -13680394.263900002, 'y': 5949000.547900... State 4 \n", "1 {'x': 11666510.350300007, 'y': -5033833.302499... State 10 \n", "2 {'x': -10857545.543799996, 'y': 4229619.674200... State 13 \n", "3 {'x': -10421826.864700003, 'y': 5099899.263700... State 16 \n", "\n", " latitude_x longitude_x name_x objectid pop2000_x pop2007_x \\\n", "0 47.042418 -122.893073 Olympia 4 27514 45523 \n", "1 -41.145545 104.802046 Cheyenne 10 53011 54750 \n", "2 35.482309 -97.534991 Oklahoma City 13 506132 552556 \n", "3 41.590936 -93.620864 Des Moines 16 200682 201257 \n", "\n", " state_x name_y state_y capital_y pop2000_y pop2007_y longitude_y \\\n", "0 WA Olympia WA State 42514 45523 -122.893073 \n", "1 WY Cheyenne WY State 53011 54750 -104.802046 \n", "2 OKK Oklahoma City OK State 506132 552556 -97.534991 \n", "3 IA Des Moines IA State 198682 201257 -93.620864 \n", "\n", " latitude_y \n", "0 47.042418 \n", "1 41.145545 \n", "2 35.482309 \n", "3 41.590936 "]}, "execution_count": 14, "metadata": {}, "output_type": "execute_result"}], "source": ["overlap_rows = pd.merge(left = cities_fset.sdf, right = cities_df_2, how='inner',\n", " on = 'city_id')\n", "overlap_rows"]}, {"cell_type": "markdown", "metadata": {}, "source": ["Thus, of 19 features in original and 36 features in second csv, 4 features are common. Inspecting the table, we find certain columns are updated, for instance, Cheyenne has its coordinates corrected, Oklahoma City has its state abbreviation corrected and similarly other cities have one of their attribute columns updated.\n", "\n", "We could either update individual attribute values for these 4 features or update all attribute values with the latest csv. Below, we are performing the latter as it is simple and fast.\n", "\n", "#### Perform updates to the existing features"]}, {"cell_type": "code", "execution_count": 15, "metadata": {}, "outputs": [], "source": ["features_for_update = [] #list containing corrected features\n", "all_features = cities_fset.features"]}, {"cell_type": "code", "execution_count": 16, "metadata": {}, "outputs": [{"data": {"text/plain": ["{\"geometry\": {\"x\": -17568824.553, \"y\": 2428377.352700006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 1, \"city_id\": 1, \"name\": \"Honolulu\", \"state\": \"HI\", \"capital\": \"State\", \"pop2000\": 371657, \"pop2007\": 378587, \"longitude\": -157.8234362, \"latitude\": 21.30578163, \"SHAPE\": {\"x\": -17568824.553, \"y\": 2428377.352700006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}"]}, "execution_count": 16, "metadata": {}, "output_type": "execute_result"}], "source": ["# inspect one of the features\n", "all_features[0]"]}, {"cell_type": "markdown", "metadata": {}, "source": ["Note the X and Y geometry values are different from decimal degree coordinates present in Longitude and Latitude fields. To perform geometry edits, we need to project the coordinates to match that of the feature layer."]}, {"cell_type": "code", "execution_count": 17, "metadata": {}, "outputs": [{"data": {"text/plain": ["{'wkid': 102100, 'latestWkid': 3857}"]}, "execution_count": 17, "metadata": {}, "output_type": "execute_result"}], "source": ["# get the spatial reference of the features since we need to update the geometry\n", "cities_fset.spatial_reference"]}, {"cell_type": "markdown", "metadata": {}, "source": ["Below, we prepare updated geometries and attributes for each of the 4 features we determined above. 
We use the `arcgis.geometry` module to `project` the coordinates from geographic to projected coordinate system. The cell below prints the original `Feature` objects followed by the updated ones. If you look closely, you can find the differences."]}, {"cell_type": "code", "execution_count": 18, "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["{\"geometry\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 4, \"city_id\": 4, \"name\": \"Olympia\", \"state\": \"WA\", \"capital\": \"State\", \"pop2000\": 27514, \"pop2007\": 45523, \"longitude\": -122.8930726, \"latitude\": 47.04241817, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n", "{\"geometry\": {\"x\": -13680394.2638528, \"y\": 5949000.54792491}, \"attributes\": {\"objectid\": 4, \"city_id\": 4, \"name\": \"Olympia\", \"state\": \"WA\", \"capital\": \"State\", \"pop2000\": 42514, \"pop2007\": 45523, \"longitude\": -122.8930726, \"latitude\": 47.04241817, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n", "========================================================================\n", "{\"geometry\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 10, \"city_id\": 10, \"name\": \"Cheyenne\", \"state\": \"WY\", \"capital\": \"State\", \"pop2000\": 53011, \"pop2007\": 54750, \"longitude\": 104.8020456, \"latitude\": -41.14554516, \"SHAPE\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n", "{\"geometry\": {\"x\": -11666510.350285435, \"y\": 5033833.302497153}, \"attributes\": {\"objectid\": 10, \"city_id\": 10, \"name\": \"Cheyenne\", \"state\": \"WY\", \"capital\": \"State\", \"pop2000\": 53011, \"pop2007\": 54750, \"longitude\": -104.80204559999999, \"latitude\": 41.14554516, \"SHAPE\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n", "========================================================================\n", "{\"geometry\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 13, \"city_id\": 13, \"name\": \"Oklahoma City\", \"state\": \"OKK\", \"capital\": \"State\", \"pop2000\": 506132, \"pop2007\": 552556, \"longitude\": -97.5349911, \"latitude\": 35.48230867, \"SHAPE\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n", "{\"geometry\": {\"x\": -10857545.54377847, \"y\": 4229619.674165817}, \"attributes\": {\"objectid\": 13, \"city_id\": 13, \"name\": \"Oklahoma City\", \"state\": \"OK\", \"capital\": \"State\", \"pop2000\": 506132, \"pop2007\": 552556, \"longitude\": -97.5349911, \"latitude\": 35.48230867, \"SHAPE\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n", "========================================================================\n", "{\"geometry\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 16, \"city_id\": 16, \"name\": \"Des 
Moines\", \"state\": \"IA\", \"capital\": \"State\", \"pop2000\": 200682, \"pop2007\": 201257, \"longitude\": -93.62086361, \"latitude\": 41.59093617, \"SHAPE\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n", "{\"geometry\": {\"x\": -10421826.864691716, \"y\": 5099899.263692743}, \"attributes\": {\"objectid\": 16, \"city_id\": 16, \"name\": \"Des Moines\", \"state\": \"IA\", \"capital\": \"State\", \"pop2000\": 198682, \"pop2007\": 201257, \"longitude\": -93.62086361, \"latitude\": 41.59093617, \"SHAPE\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n", "========================================================================\n"]}], "source": ["from arcgis import geometry #use geometry module to project Long,Lat to X and Y\n", "from copy import deepcopy\n", "\n", "for city_id in overlap_rows['city_id']:\n", " # get the feature to be updated\n", " original_feature = [f for f in all_features if f.attributes['city_id'] == city_id][0]\n", " feature_to_be_updated = deepcopy(original_feature)\n", " \n", " print(str(original_feature))\n", " \n", " # get the matching row from csv\n", " matching_row = cities_df_2.where(cities_df_2.city_id == city_id).dropna()\n", " \n", " #get geometries in the destination coordinate system\n", " input_geometry = {'y':float(matching_row['latitude']),\n", " 'x':float(matching_row['longitude'])}\n", " output_geometry = geometry.project(geometries = [input_geometry],\n", " in_sr = 4326, \n", " out_sr = cities_fset.spatial_reference['latestWkid'],\n", " gis = gis)\n", " \n", " # assign the updated values\n", " feature_to_be_updated.geometry = output_geometry[0]\n", " feature_to_be_updated.attributes['longitude'] = float(matching_row['longitude'])\n", " feature_to_be_updated.attributes['city_id'] = int(matching_row['city_id'])\n", " feature_to_be_updated.attributes['state'] = matching_row['state'].values[0]\n", " feature_to_be_updated.attributes['capital'] = matching_row['capital'].values[0]\n", " feature_to_be_updated.attributes['latitude'] = float(matching_row['latitude'])\n", " feature_to_be_updated.attributes['name'] = matching_row['name'].values[0]\n", " feature_to_be_updated.attributes['pop2000'] = int(matching_row['pop2000'])\n", " feature_to_be_updated.attributes['pop2007'] = int(matching_row['pop2007'])\n", " \n", " #add this to the list of features to be updated\n", " features_for_update.append(feature_to_be_updated)\n", " \n", " print(str(feature_to_be_updated))\n", " print(\"========================================================================\")"]}, {"cell_type": "markdown", "metadata": {}, "source": ["We have constructed a list of features with updated values. 
We can use this list to perform updates on the feature layer."]}, {"cell_type": "code", "execution_count": 19, "metadata": {}, "outputs": [{"data": {"text/plain": ["[{\"geometry\": {\"x\": -13680394.2638528, \"y\": 5949000.54792491}, \"attributes\": {\"objectid\": 4, \"city_id\": 4, \"name\": \"Olympia\", \"state\": \"WA\", \"capital\": \"State\", \"pop2000\": 42514, \"pop2007\": 45523, \"longitude\": -122.8930726, \"latitude\": 47.04241817, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}},\n", " {\"geometry\": {\"x\": -11666510.350285435, \"y\": 5033833.302497153}, \"attributes\": {\"objectid\": 10, \"city_id\": 10, \"name\": \"Cheyenne\", \"state\": \"WY\", \"capital\": \"State\", \"pop2000\": 53011, \"pop2007\": 54750, \"longitude\": -104.80204559999999, \"latitude\": 41.14554516, \"SHAPE\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}},\n", " {\"geometry\": {\"x\": -10857545.54377847, \"y\": 4229619.674165817}, \"attributes\": {\"objectid\": 13, \"city_id\": 13, \"name\": \"Oklahoma City\", \"state\": \"OK\", \"capital\": \"State\", \"pop2000\": 506132, \"pop2007\": 552556, \"longitude\": -97.5349911, \"latitude\": 35.48230867, \"SHAPE\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}},\n", " {\"geometry\": {\"x\": -10421826.864691716, \"y\": 5099899.263692743}, \"attributes\": {\"objectid\": 16, \"city_id\": 16, \"name\": \"Des Moines\", \"state\": \"IA\", \"capital\": \"State\", \"pop2000\": 198682, \"pop2007\": 201257, \"longitude\": -93.62086361, \"latitude\": 41.59093617, \"SHAPE\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}]"]}, "execution_count": 19, "metadata": {}, "output_type": "execute_result"}], "source": ["features_for_update"]}, {"cell_type": "markdown", "metadata": {}, "source": ["To update the feature layer, call the `edit_features()` method of the `FeatureLayer` object and pass the list of features to the `updates` parameter:"]}, {"cell_type": "code", "execution_count": 20, "metadata": {}, "outputs": [{"data": {"text/plain": ["{'addResults': [],\n", " 'updateResults': [{'objectId': 4, 'globalId': None, 'success': True},\n", " {'objectId': 10, 'globalId': None, 'success': True},\n", " {'objectId': 13, 'globalId': None, 'success': True},\n", " {'objectId': 16, 'globalId': None, 'success': True}],\n", " 'deleteResults': [],\n", " 'attachments': {'addResults': [], 'updateResults': [], 'deleteResults': []}}"]}, "execution_count": 20, "metadata": {}, "output_type": "execute_result"}], "source": ["cities_flayer.edit_features(updates= features_for_update)"]}, {"cell_type": "markdown", "metadata": {}, "source": ["We have successfully applied corrections to those features which existed in the feature layer from the initial dataset. 
Next let us proceed to adding new features present only in the second csv file.\n", "\n", "#### Identifying new features that need to be added"]}, {"cell_type": "code", "execution_count": 21, "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["(32, 8)\n"]}], "source": ["#select those rows in the capitals_2.csv that do not overlap with those in capitals_1.csv\n", "new_rows = cities_df_2[~cities_df_2['city_id'].isin(overlap_rows['city_id'])]\n", "print(new_rows.shape)"]}, {"cell_type": "code", "execution_count": 22, "metadata": {}, "outputs": [{"data": {"text/html": ["
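A performance note: `geometry.project()` accepts a whole list of geometries, so all 32 new points could be projected in a single request rather than one call per row as in the loop below. A hedged sketch:

```python
# project every new point in one round trip instead of per-row calls
input_geoms = [{'x': float(r['longitude']), 'y': float(r['latitude'])}
               for _, r in new_rows.iterrows()]
projected = geometry.project(geometries=input_geoms, in_sr=4326,
                             out_sr=cities_fset.spatial_reference['latestWkid'],
                             gis=gis)
len(projected)  # one projected geometry per new row
```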
"], "text/plain": [" city_id name state capital pop2000 pop2007 longitude latitude\n", "0 20 Baton Rouge LA State 227818 228810 -91.140227 30.458091\n", "1 21 Helena MT State 25780 26007 -112.027027 46.595809\n", "2 22 Bismarck ND State 55532 59344 -100.779000 46.813346\n", "3 23 Pierre SD State 13876 14169 -100.336382 44.367964\n", "4 24 St. Paul MN State 287151 291643 -93.114118 44.954364"]}, "execution_count": 22, "metadata": {}, "output_type": "execute_result"}], "source": ["new_rows.head()"]}, {"cell_type": "markdown", "metadata": {}, "source": ["Thus, of the total 36 rows in the second csv, we have determined the 32 other rows which are new and need to be appended as new features."]}, {"cell_type": "markdown", "metadata": {}, "source": ["#### Adding new features\n", "Next, let us compose another `list` of `Feature` objects similar to earlier, from the `new_rows` data frame."]}, {"cell_type": "code", "execution_count": 23, "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["Creating Baton Rouge\n", "Creating Helena\n", "Creating Bismarck\n", "Creating Pierre\n", "Creating St. Paul\n", "Creating Madison\n", "Creating Lansing\n", "Creating Augusta\n", "Creating Montpelier\n", "Creating Albany\n", "Creating Boston\n", "Creating Concord\n", "Creating Tallahassee\n", "Creating Jackson\n", "Creating Nashville\n", "Creating Montgomery\n", "Creating Springfield\n", "Creating Indianapolis\n", "Creating Frankfort\n", "Creating Columbus\n", "Creating Charleston\n", "Creating Atlanta\n", "Creating Columbia\n", "Creating Raleigh\n", "Creating Richmond\n", "Creating Harrisburg\n", "Creating Trenton\n", "Creating Dover\n", "Creating Washington\n", "Creating Annapolis\n", "Creating Hartford\n", "Creating Providence\n"]}], "source": ["features_to_be_added = []\n", "\n", "# get a template feature object\n", "template_feature = deepcopy(features_for_update[0])\n", "\n", "# loop through each row and add to the list of features to be added\n", "for row in new_rows.iterrows():\n", " new_feature = deepcopy(template_feature)\n", " \n", " #print\n", " print(\"Creating \" + row[1]['name'])\n", " \n", " #get geometries in the destination coordinate system\n", " input_geometry = {'y':float(row[1]['latitude']),\n", " 'x':float(row[1]['longitude'])}\n", " output_geometry = geometry.project(geometries = [input_geometry],\n", " in_sr = 4326, \n", " out_sr = cities_fset.spatial_reference['latestWkid'],\n", " gis = gis)\n", " \n", " # assign the updated values\n", " new_feature.geometry = output_geometry[0]\n", " new_feature.attributes['longitude'] = float(row[1]['longitude'])\n", " new_feature.attributes['city_id'] = int(row[1]['city_id'])\n", " new_feature.attributes['state'] = row[1]['state']\n", " new_feature.attributes['capital'] = row[1]['capital']\n", " new_feature.attributes['latitude'] = float(row[1]['latitude'])\n", " new_feature.attributes['name'] = row[1]['name']\n", " new_feature.attributes['pop2000'] = int(row[1]['pop2000'])\n", " new_feature.attributes['pop2007'] = int(row[1]['pop2007'])\n", " \n", " #add this to the list of features to be updated\n", " features_to_be_added.append(new_feature)"]}, {"cell_type": "code", "execution_count": 24, "metadata": {}, "outputs": [{"data": {"text/plain": ["{\"geometry\": {\"x\": -10145683.671555312, \"y\": 3562570.141302621}, \"attributes\": {\"objectid\": 4, \"city_id\": 20, \"name\": \"Baton Rouge\", \"state\": \"LA\", \"capital\": \"State\", \"pop2000\": 227818, \"pop2007\": 228810, \"longitude\": -91.14022709999999, \"latitude\": 
30.45809113, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}"]}, "execution_count": 24, "metadata": {}, "output_type": "execute_result"}], "source": ["# take a look at one of the features we created\n", "features_to_be_added[0]"]}, {"cell_type": "markdown", "metadata": {}, "source": ["Thus, we have created a `list` of `Feature` objects with appropriate attributes and geometries. Next, to add these new features to the feature layer, call the `edit_features()` method of the `FeatureLayer` object and pass the list of `Feature` objects to the `adds` parameter:"]}, {"cell_type": "code", "execution_count": 25, "metadata": {}, "outputs": [{"data": {"text/plain": ["{'addResults': [{'objectId': 20, 'globalId': None, 'success': True},\n", " {'objectId': 21, 'globalId': None, 'success': True},\n", " {'objectId': 22, 'globalId': None, 'success': True},\n", " {'objectId': 23, 'globalId': None, 'success': True},\n", " {'objectId': 24, 'globalId': None, 'success': True},\n", " {'objectId': 25, 'globalId': None, 'success': True},\n", " {'objectId': 26, 'globalId': None, 'success': True},\n", " {'objectId': 27, 'globalId': None, 'success': True},\n", " {'objectId': 28, 'globalId': None, 'success': True},\n", " {'objectId': 29, 'globalId': None, 'success': True},\n", " {'objectId': 30, 'globalId': None, 'success': True},\n", " {'objectId': 31, 'globalId': None, 'success': True},\n", " {'objectId': 32, 'globalId': None, 'success': True},\n", " {'objectId': 33, 'globalId': None, 'success': True},\n", " {'objectId': 34, 'globalId': None, 'success': True},\n", " {'objectId': 35, 'globalId': None, 'success': True},\n", " {'objectId': 36, 'globalId': None, 'success': True},\n", " {'objectId': 37, 'globalId': None, 'success': True},\n", " {'objectId': 38, 'globalId': None, 'success': True},\n", " {'objectId': 39, 'globalId': None, 'success': True},\n", " {'objectId': 40, 'globalId': None, 'success': True},\n", " {'objectId': 41, 'globalId': None, 'success': True},\n", " {'objectId': 42, 'globalId': None, 'success': True},\n", " {'objectId': 43, 'globalId': None, 'success': True},\n", " {'objectId': 44, 'globalId': None, 'success': True},\n", " {'objectId': 45, 'globalId': None, 'success': True},\n", " {'objectId': 46, 'globalId': None, 'success': True},\n", " {'objectId': 47, 'globalId': None, 'success': True},\n", " {'objectId': 48, 'globalId': None, 'success': True},\n", " {'objectId': 49, 'globalId': None, 'success': True},\n", " {'objectId': 50, 'globalId': None, 'success': True},\n", " {'objectId': 51, 'globalId': None, 'success': True}],\n", " 'updateResults': [],\n", " 'deleteResults': [],\n", " 'attachments': {'addResults': [], 'updateResults': [], 'deleteResults': []}}"]}, "execution_count": 25, "metadata": {}, "output_type": "execute_result"}], "source": ["cities_flayer.edit_features(adds = features_to_be_added)"]}, {"cell_type": "markdown", "metadata": {}, "source": ["Thus, we have successfully applied edits from second csv file. Next let us look at how we can apply edits from third csv file.\n", "\n", "### Apply edits from third spreadsheet\n", "The next set of updates have arrived and are stored in `capitals_annex.csv`. We are told it contains additional columns for each of the features that we want to add to the feature layer.\n", "\n", "To start with, let us read the third csv file. Note in this sample, data is stored in csv. 
In reality, it could be from your enterprise database or any other data source."]}, {"cell_type": "code", "execution_count": 26, "metadata": {}, "outputs": [{"data": {"text/html": ["
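Further below, the one non-integer column (`class`) is special-cased by name when the new fields are composed. With more heterogeneous data, the service field type could instead be derived from each pandas dtype; a hedged sketch:

```python
import numpy as np

def esri_field_type(dtype):
    """Map a pandas/numpy dtype onto a service field type and sqlType."""
    if np.issubdtype(dtype, np.integer):
        return 'esriFieldTypeInteger', 'sqlTypeInteger'
    if np.issubdtype(dtype, np.floating):
        return 'esriFieldTypeDouble', 'sqlTypeNumeric'
    return 'esriFieldTypeString', 'sqlTypeVarchar'

esri_field_type(cities_df_3['class'].dtype)  # ('esriFieldTypeString', 'sqlTypeVarchar')
```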
"], "text/plain": [" city_id name class white black ameri_es \\\n", "0 1 Honolulu Census Designated Place 73093 6038 689 \n", "1 2 Juneau City and Borough 22969 248 3496 \n", "2 3 Boise City City 171204 1437 1300 \n", "3 4 Olympia City 36246 805 553 \n", "4 5 Salem City 113746 1750 2064 \n", "\n", " asian hawn_pl hispanic males females \n", "0 207588 25457 16229 182628 189029 \n", "1 1438 116 1040 15469 15242 \n", "2 3870 302 8410 92014 93773 \n", "3 2473 125 1863 20319 22195 \n", "4 3304 643 19973 68752 68172 "]}, "execution_count": 26, "metadata": {}, "output_type": "execute_result"}], "source": ["# read the third csv set\n", "csv3 = 'data/updating_gis_content/capitals_annex.csv'\n", "cities_df_3 = pd.read_csv(csv3)\n", "cities_df_3.head()"]}, {"cell_type": "code", "execution_count": 27, "metadata": {}, "outputs": [{"data": {"text/plain": ["(51, 11)"]}, "execution_count": 27, "metadata": {}, "output_type": "execute_result"}], "source": ["#find the number of rows in the third csv\n", "cities_df_3.shape"]}, {"cell_type": "markdown", "metadata": {}, "source": ["The `capitals_annex.csv` does not add new features, instead it adds additional attribute columns to existing features. It has 51 rows which were found to match the 19 + 32 rows from first and second csv files. The columns `City_ID` and `NAME` are common to all 3 spreadsheets. Next let us take a look at how we can append this additional attribute information to our feature layer.\n", "\n", "#### Inspecting existing fields of the feature layer\n", "The `manager` property of the `FeatureLayer` object exposes a set of methods to read and update the properties and definition of feature layers."]}, {"cell_type": "code", "execution_count": 28, "metadata": {}, "outputs": [{"data": {"text/plain": ["{\n", " \"name\": \"city_id\",\n", " \"type\": \"esriFieldTypeInteger\",\n", " \"alias\": \"city_id\",\n", " \"domain\": null,\n", " \"editable\": true,\n", " \"nullable\": true,\n", " \"sqlType\": \"sqlTypeInteger\",\n", " \"length\": 10\n", "}"]}, "execution_count": 28, "metadata": {}, "output_type": "execute_result"}], "source": ["#Get the existing list of fields on the cities feature layer\n", "cities_fields = cities_flayer.manager.properties.fields\n", "\n", "# Your feature layer may have multiple fields, \n", "# instead of printing all, let us take a look at one of the fields:\n", "cities_fields[1]"]}, {"cell_type": "markdown", "metadata": {}, "source": ["From above, we can see the representation of one of the fields. 
Let us loop through each of the fields and print the `name`, `alias`, `type` and `sqlType` properties"]}, {"cell_type": "code", "execution_count": 29, "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["objectid | objectid | esriFieldTypeOID | sqlTypeInteger\n", "city_id | city_id | esriFieldTypeInteger | sqlTypeInteger\n", "name | name | esriFieldTypeString | sqlTypeVarchar\n", "state | state | esriFieldTypeString | sqlTypeVarchar\n", "capital | capital | esriFieldTypeString | sqlTypeVarchar\n", "pop2000 | pop2000 | esriFieldTypeInteger | sqlTypeInteger\n", "pop2007 | pop2007 | esriFieldTypeInteger | sqlTypeInteger\n", "longitude | longitude | esriFieldTypeDouble | sqlTypeNumeric\n", "latitude | latitude | esriFieldTypeDouble | sqlTypeNumeric\n"]}], "source": ["for field in cities_fields:\n", " print(f\"{field.name:13}| {field.alias:13}| {field.type:25}| {field.sqlType}\")"]}, {"cell_type": "markdown", "metadata": {}, "source": ["#### Preparing additional columns to add to the feature layer\n", "Now that we have an idea of how the fields are defined, we can go ahead and append new fields to the layer's definition. Once we compose a list of new fields, by calling the `add_to_definition()` method we can push those changes to the feature layer. Once the feature layer's definition is updated with new fields, we can loop through each feature and add the appropriate attribute values.\n", "\n", "To compose a list of new fields to be added, we start by making a copy of one of the fields as a template and start editing it. One easy part in this example is, all new fields that need to be added except one, are of the same data type: integer. With your data, this may not be the case. In such instances, you can add each field individually."]}, {"cell_type": "code", "execution_count": 30, "metadata": {}, "outputs": [{"data": {"text/plain": ["{'name': 'city_id',\n", " 'type': 'esriFieldTypeInteger',\n", " 'alias': 'city_id',\n", " 'domain': None,\n", " 'editable': True,\n", " 'nullable': True,\n", " 'sqlType': 'sqlTypeInteger',\n", " 'length': 10}"]}, "execution_count": 30, "metadata": {}, "output_type": "execute_result"}], "source": ["# get a template field\n", "template_field = dict(deepcopy(cities_fields[1]))\n", "template_field"]}, {"cell_type": "markdown", "metadata": {}, "source": ["Let us use pandas to get the list of fields that are **new** in spread sheet 3"]}, {"cell_type": "code", "execution_count": 31, "metadata": {}, "outputs": [{"data": {"text/plain": ["['ameri_es',\n", " 'asian',\n", " 'black',\n", " 'class',\n", " 'females',\n", " 'hawn_pl',\n", " 'hispanic',\n", " 'males',\n", " 'white']"]}, "execution_count": 31, "metadata": {}, "output_type": "execute_result"}], "source": ["# get the list of new fields to add from the third spreadsheet, that are not in spread sheets 1,2\n", "new_field_names = list(cities_df_3.columns.difference(cities_df_1.columns))\n", "new_field_names"]}, {"cell_type": "markdown", "metadata": {}, "source": ["Now loop though each new field name and create a field dictionary using the template we created earlier. 
Except the field titled `class` all other fields are of type `integer`."]}, {"cell_type": "code", "execution_count": 32, "metadata": {}, "outputs": [{"data": {"text/plain": ["9"]}, "execution_count": 32, "metadata": {}, "output_type": "execute_result"}], "source": ["fields_to_be_added = []\n", "for new_field_name in new_field_names:\n", " current_field = deepcopy(template_field)\n", " if new_field_name.lower() == 'class':\n", " current_field['sqlType'] = 'sqlTypeVarchar'\n", " current_field['type'] = 'esriFieldTypeString'\n", " current_field['length'] = 8000\n", " current_field['name'] = new_field_name.lower()\n", " current_field['alias'] = new_field_name\n", " fields_to_be_added.append(current_field)\n", " \n", "len(fields_to_be_added)"]}, {"cell_type": "code", "execution_count": 33, "metadata": {}, "outputs": [{"data": {"text/plain": ["{'name': 'class',\n", " 'type': 'esriFieldTypeString',\n", " 'alias': 'class',\n", " 'domain': None,\n", " 'editable': True,\n", " 'nullable': True,\n", " 'sqlType': 'sqlTypeVarchar',\n", " 'length': 8000}"]}, "execution_count": 33, "metadata": {}, "output_type": "execute_result"}], "source": ["#inspect one of the fields\n", "fields_to_be_added[3]"]}, {"cell_type": "markdown", "metadata": {}, "source": ["#### Adding additional fields to the feature layer\n", "The list of new fields we composed can be pushed to the server by calling `add_to_definition()` method on the `manager` property."]}, {"cell_type": "code", "execution_count": 34, "metadata": {}, "outputs": [{"data": {"text/plain": ["{'success': True}"]}, "execution_count": 34, "metadata": {}, "output_type": "execute_result"}], "source": ["cities_flayer.manager.add_to_definition({'fields':fields_to_be_added})"]}, {"cell_type": "markdown", "metadata": {}, "source": ["Thus, we have successfully added new fields to our feature layer. 
Let us verify the new columns show up:"]}, {"cell_type": "code", "execution_count": 35, "metadata": {}, "outputs": [{"data": {"text/plain": ["18"]}, "execution_count": 35, "metadata": {}, "output_type": "execute_result"}], "source": ["new_cities_fields = cities_flayer.manager.properties.fields\n", "len(new_cities_fields)"]}, {"cell_type": "code", "execution_count": 36, "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["objectid | esriFieldTypeOID\n", "city_id | esriFieldTypeInteger\n", "name | esriFieldTypeString\n", "state | esriFieldTypeString\n", "capital | esriFieldTypeString\n", "pop2000 | esriFieldTypeInteger\n", "pop2007 | esriFieldTypeInteger\n", "longitude | esriFieldTypeDouble\n", "latitude | esriFieldTypeDouble\n", "ameri_es | esriFieldTypeInteger\n", "asian | esriFieldTypeInteger\n", "black | esriFieldTypeInteger\n", "class | esriFieldTypeString\n", "females | esriFieldTypeInteger\n", "hawn_pl | esriFieldTypeInteger\n", "hispanic | esriFieldTypeInteger\n", "males | esriFieldTypeInteger\n", "white | esriFieldTypeInteger\n"]}], "source": ["for field in new_cities_fields:\n", " print(f\"{field.name:10}| {field.type}\")"]}, {"cell_type": "markdown", "metadata": {}, "source": ["#### Adding attribute values to the new columns\n", "Next we can loop through each row in the third csv and add the new attribute values for these newly created columns."]}, {"cell_type": "code", "execution_count": 37, "metadata": {}, "outputs": [], "source": ["# Run a fresh query on the feature layer so it includes the new features from\n", "# csv2 and new columns from csv3\n", "cities_fset2 = cities_flayer.query()\n", "cities_features2 = cities_fset2.features"]}, {"cell_type": "markdown", "metadata": {}, "source": ["Loop through each row in the third spreadsheet, find the corresponding feature by matching the `city_id` value and apply the attribute values for the new fields."]}, {"cell_type": "code", "execution_count": 38, "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["1 Adding additional attributes for: Honolulu\n", "2 Adding additional attributes for: Juneau\n", "3 Adding additional attributes for: Boise City\n", "4 Adding additional attributes for: Olympia\n", "5 Adding additional attributes for: Salem\n", "6 Adding additional attributes for: Carson\n", "7 Adding additional attributes for: Sacramento\n", "8 Adding additional attributes for: Phoenix\n", "9 Adding additional attributes for: Salt Lake City\n", "10 Adding additional attributes for: Cheyenne\n", "11 Adding additional attributes for: Denver\n", "12 Adding additional attributes for: Santa Fe\n", "13 Adding additional attributes for: Oklahoma City\n", "14 Adding additional attributes for: Topeka\n", "15 Adding additional attributes for: Lincoln\n", "16 Adding additional attributes for: Des Moines\n", "17 Adding additional attributes for: Jefferson City\n", "18 Adding additional attributes for: Little Rock\n", "19 Adding additional attributes for: Austin\n", "20 Adding additional attributes for: Baton Rouge\n", "21 Adding additional attributes for: Helena\n", "22 Adding additional attributes for: Bismarck\n", "23 Adding additional attributes for: Pierre\n", "24 Adding additional attributes for: St. 
Paul\n", "25 Adding additional attributes for: Madison\n", "26 Adding additional attributes for: Lansing\n", "27 Adding additional attributes for: Augusta\n", "28 Adding additional attributes for: Montpelier\n", "29 Adding additional attributes for: Albany\n", "30 Adding additional attributes for: Boston\n", "31 Adding additional attributes for: Concord\n", "32 Adding additional attributes for: Tallahassee\n", "33 Adding additional attributes for: Jackson\n", "34 Adding additional attributes for: Nashville\n", "35 Adding additional attributes for: Montgomery\n", "36 Adding additional attributes for: Springfield\n", "37 Adding additional attributes for: Indianapolis\n", "38 Adding additional attributes for: Frankfort\n", "39 Adding additional attributes for: Columbus\n", "40 Adding additional attributes for: Charleston\n", "41 Adding additional attributes for: Atlanta\n", "42 Adding additional attributes for: Columbia\n", "43 Adding additional attributes for: Raleigh\n", "44 Adding additional attributes for: Richmond\n", "45 Adding additional attributes for: Harrisburg\n", "46 Adding additional attributes for: Trenton\n", "47 Adding additional attributes for: Dover\n", "48 Adding additional attributes for: Washington\n", "49 Adding additional attributes for: Annapolis\n", "50 Adding additional attributes for: Hartford\n", "51 Adding additional attributes for: Providence\n"]}], "source": ["features_for_update = []\n", "for city_id in cities_df_3['city_id']:\n", " # get the matching row from csv\n", " matching_row = cities_df_3.where(cities_df_3.city_id == city_id).dropna()\n", " \n", " print(str(city_id) + \" Adding additional attributes for: \" + matching_row['name'].values[0])\n", " \n", " # get the feature to be updated\n", " original_feature = [f for f in cities_features2 if f.attributes['city_id'] == city_id][0]\n", " feature_to_be_updated = deepcopy(original_feature)\n", " \n", " # assign the updated values\n", " feature_to_be_updated.attributes['class'] = matching_row['class'].values[0]\n", " feature_to_be_updated.attributes['white'] = int(matching_row['white'])\n", " feature_to_be_updated.attributes['black'] = int(matching_row['black'])\n", " feature_to_be_updated.attributes['ameri_es'] = int(matching_row['ameri_es'])\n", " feature_to_be_updated.attributes['asian'] = int(matching_row['asian'])\n", " feature_to_be_updated.attributes['hawn_pl'] = int(matching_row['hawn_pl'])\n", " feature_to_be_updated.attributes['hispanic'] = int(matching_row['hispanic'])\n", " feature_to_be_updated.attributes['males'] = int(matching_row['males'])\n", " feature_to_be_updated.attributes['females'] = int(matching_row['females'])\n", " \n", " #add this to the list of features to be updated\n", " features_for_update.append(feature_to_be_updated)"]}, {"cell_type": "code", "execution_count": 39, "metadata": {}, "outputs": [{"data": {"text/plain": ["{\"geometry\": {\"x\": -7950674.8190312665, \"y\": 5134585.3226086125}, \"attributes\": {\"objectid\": 51, \"city_id\": 51, \"name\": \"Providence\", \"state\": \"RI\", \"capital\": \"State\", \"pop2000\": 173618, \"pop2007\": 183731, \"longitude\": -71.42212709, \"latitude\": 41.82355466, \"ameri_es\": 1975, \"asian\": 10432, \"black\": 25243, \"class\": \"City\", \"females\": 90583, \"hawn_pl\": 270, \"hispanic\": 52146, \"males\": 83035, \"white\": 94666}}"]}, "execution_count": 39, "metadata": {}, "output_type": "execute_result"}], "source": ["# inspect one of the features\n", "features_for_update[-1]"]}, {"cell_type": "code", "execution_count": 40, 
"metadata": {}, "outputs": [{"data": {"text/plain": ["{'addResults': [],\n", " 'updateResults': [{'objectId': 1, 'globalId': None, 'success': True},\n", " {'objectId': 2, 'globalId': None, 'success': True},\n", " {'objectId': 3, 'globalId': None, 'success': True},\n", " {'objectId': 4, 'globalId': None, 'success': True},\n", " {'objectId': 5, 'globalId': None, 'success': True},\n", " {'objectId': 6, 'globalId': None, 'success': True},\n", " {'objectId': 7, 'globalId': None, 'success': True},\n", " {'objectId': 8, 'globalId': None, 'success': True},\n", " {'objectId': 9, 'globalId': None, 'success': True},\n", " {'objectId': 10, 'globalId': None, 'success': True},\n", " {'objectId': 11, 'globalId': None, 'success': True},\n", " {'objectId': 12, 'globalId': None, 'success': True},\n", " {'objectId': 13, 'globalId': None, 'success': True},\n", " {'objectId': 14, 'globalId': None, 'success': True},\n", " {'objectId': 15, 'globalId': None, 'success': True},\n", " {'objectId': 16, 'globalId': None, 'success': True},\n", " {'objectId': 17, 'globalId': None, 'success': True},\n", " {'objectId': 18, 'globalId': None, 'success': True},\n", " {'objectId': 19, 'globalId': None, 'success': True},\n", " {'objectId': 20, 'globalId': None, 'success': True},\n", " {'objectId': 21, 'globalId': None, 'success': True},\n", " {'objectId': 22, 'globalId': None, 'success': True},\n", " {'objectId': 23, 'globalId': None, 'success': True},\n", " {'objectId': 24, 'globalId': None, 'success': True},\n", " {'objectId': 25, 'globalId': None, 'success': True},\n", " {'objectId': 26, 'globalId': None, 'success': True},\n", " {'objectId': 27, 'globalId': None, 'success': True},\n", " {'objectId': 28, 'globalId': None, 'success': True},\n", " {'objectId': 29, 'globalId': None, 'success': True},\n", " {'objectId': 30, 'globalId': None, 'success': True},\n", " {'objectId': 31, 'globalId': None, 'success': True},\n", " {'objectId': 32, 'globalId': None, 'success': True},\n", " {'objectId': 33, 'globalId': None, 'success': True},\n", " {'objectId': 34, 'globalId': None, 'success': True},\n", " {'objectId': 35, 'globalId': None, 'success': True},\n", " {'objectId': 36, 'globalId': None, 'success': True},\n", " {'objectId': 37, 'globalId': None, 'success': True},\n", " {'objectId': 38, 'globalId': None, 'success': True},\n", " {'objectId': 39, 'globalId': None, 'success': True},\n", " {'objectId': 40, 'globalId': None, 'success': True},\n", " {'objectId': 41, 'globalId': None, 'success': True},\n", " {'objectId': 42, 'globalId': None, 'success': True},\n", " {'objectId': 43, 'globalId': None, 'success': True},\n", " {'objectId': 44, 'globalId': None, 'success': True},\n", " {'objectId': 45, 'globalId': None, 'success': True},\n", " {'objectId': 46, 'globalId': None, 'success': True},\n", " {'objectId': 47, 'globalId': None, 'success': True},\n", " {'objectId': 48, 'globalId': None, 'success': True},\n", " {'objectId': 49, 'globalId': None, 'success': True},\n", " {'objectId': 50, 'globalId': None, 'success': True},\n", " {'objectId': 51, 'globalId': None, 'success': True}],\n", " 'deleteResults': [],\n", " 'attachments': {'addResults': [], 'updateResults': [], 'deleteResults': []}}"]}, "execution_count": 40, "metadata": {}, "output_type": "execute_result"}], "source": ["# apply the edits to the feature layer\n", "cities_flayer.edit_features(updates= features_for_update)"]}, {"cell_type": "markdown", "metadata": {}, "source": ["#### Verify the changes made so far\n", "Let us run another query on the feature layer and visualize a 
few rows."]}, {"cell_type": "code", "execution_count": 41, "metadata": {}, "outputs": [{"data": {"text/html": ["
"], "text/plain": [" SHAPE ameri_es asian black \\\n", "0 {\"x\": -17568824.553, \"y\": 2428377.352700006, \"... 689 207588 6038 \n", "1 {\"x\": -14973760.769500002, \"y\": 8041504.674200... 3496 1438 248 \n", "2 {\"x\": -12939516.521100001, \"y\": 5405860.248099... 1300 3870 1437 \n", "3 {\"x\": -13680394.263852797, \"y\": 5949000.547924... 553 2473 805 \n", "4 {\"x\": -13695542.842799995, \"y\": 5610682.544100... 2064 3304 1750 \n", "\n", " capital city_id class females hawn_pl hispanic \\\n", "0 State 1 Census Designated Place 189029 25457 16229 \n", "1 State 2 City and Borough 15242 116 1040 \n", "2 State 3 City 93773 302 8410 \n", "3 State 4 City 22195 125 1863 \n", "4 State 5 City 68172 643 19973 \n", "\n", " latitude longitude males name objectid pop2000 pop2007 \\\n", "0 21.305782 -157.823436 182628 Honolulu 1 371657 378587 \n", "1 58.351418 -134.511582 15469 Juneau 2 30711 31592 \n", "2 43.613736 -116.237655 92014 Boise City 3 185787 203529 \n", "3 47.042418 -122.893073 20319 Olympia 4 42514 45523 \n", "4 44.931109 -123.029155 68752 Salem 5 136924 152039 \n", "\n", " state white \n", "0 HI 73093 \n", "1 AK 22969 \n", "2 ID 171204 \n", "3 WA 36246 \n", "4 OR 113746 "]}, "execution_count": 41, "metadata": {}, "output_type": "execute_result"}], "source": ["cities_fset3 = cities_flayer.query()\n", "cities_fset3.sdf.head(5)"]}, {"cell_type": "markdown", "metadata": {}, "source": ["## Conclusion\n", "In this sample, we observed an edit intensive method to keep feature layers updated. We published data from first spreadsheet as a feature layer. We then updated existing features from second spread sheet (used geometry module to project the coordinates in the process), and added new features. The third spreadsheet presented additional attribute columns which were added to the feature layer by editing its definition and then updating the features with this additional data.\n", "\n", "This method is editing intensive and you may choose this when the number of features to edit is less or if you needed to selectively update certain features as updates come in.\n", "\n", "An alternate method is to overwrite the feature layer altogether when you always have current information coming in. This method is explained in the sample [Overwriting feature layers](https://developers.arcgis.com/python/sample-notebooks/overwriting-feature-layers)"]}], "metadata": {"anaconda-cloud": {}, "esriNotebookRuntime": {"notebookRuntimeName": "ArcGIS Notebook Python 3 Standard", "notebookRuntimeVersion": "4.0"}, "kernelspec": {"display_name": "Python 3", "language": "python", "name": "python3"}, "language_info": {"codemirror_mode": {"name": "ipython", "version": 3}, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.8.2"}, "toc": {"base_numbering": 1, "nav_menu": {}, "number_sections": true, "sideBar": true, "skip_h1_title": false, "title_cell": "Table of Contents", "title_sidebar": "Contents", "toc_cell": true, "toc_position": {}, "toc_section_display": true, "toc_window_display": true}}, "nbformat": 4, "nbformat_minor": 1} \ No newline at end of file +{"cells":[{"cell_type":"markdown","metadata":{},"source":["# Updating features in a feature layer\n","\n","As content publishers, you may be required to keep certain web layers up to date. As new data arrives, you may have to append new features, update existing features etc. 
There are a couple of different options to accomplish this:\n"," \n"," - Method 1: editing individual features as updated datasets are available\n"," - Method 2: overwriting feature layers altogether with updated datasets\n"," \n","Depending on the number of features that are updated and your workflow requirements, you may adopt either or both kinds of update mechanisms.\n","\n","In this sample, we explore the first method. For **Method 2**, refer to the sample titled [Overwriting feature layers](https://developers.arcgis.com/python/sample-notebooks/overwriting-feature-layers).\n","\n","**Note**: To run this sample, you need the ``pandas`` library in your conda environment. If you don't have the library, install it by running the following command from cmd.exe or your shell:\n","```\n","conda install pandas\n","```"]},{"cell_type":"markdown","metadata":{"toc":true},"source":["

Table of Contents

\n",""]},{"cell_type":"code","execution_count":3,"metadata":{},"outputs":[],"source":["# Connect to the GIS\n","from arcgis.gis import GIS\n","from arcgis import features\n","import pandas as pd"]},{"cell_type":"code","execution_count":4,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["Enter password: ········\n"]}],"source":["#Access the portal using \"amazing_arcgis_123\" as password for the given Username. \n","gis = GIS(\"https://pythonapi.playground.esri.com/portal\")"]},{"cell_type":"markdown","metadata":{},"source":["## Updating feature layer by editing individual features\n","\n","Let us consider a scenario where we need to update a feature layer containing the capital cities of the US. We have 3 csv datasets simulating an update workflow as described below:\n","\n"," 1. capitals_1.csv -- contains the initial, incomplete dataset\n"," 2. capitals_2.csv -- contains additional points and updates to existing points, building on top of capitals_1.csv\n"," 3. capitals_annex.csv -- an alternate table containing additional attribute information\n"," \n","Our goal is to update the feature layer with each of these datasets doing the necessary edit operations."]},{"cell_type":"markdown","metadata":{},"source":["### Publish the cities feature layer using the initial dataset"]},{"cell_type":"code","execution_count":5,"metadata":{},"outputs":[{"data":{"text/html":["
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
city_idnamestatecapitalpop2000pop2007longitudelatitude
01HonoluluHIState371657378587-157.82343621.305782
12JuneauAKState3071131592-134.51158258.351418
23Boise CityIDState185787203529-116.23765543.613736
34OlympiaWAState2751445523-122.89307347.042418
45SalemORState136924152039-123.02915544.931109
\n","
"],"text/plain":[" city_id name state capital pop2000 pop2007 longitude latitude\n","0 1 Honolulu HI State 371657 378587 -157.823436 21.305782\n","1 2 Juneau AK State 30711 31592 -134.511582 58.351418\n","2 3 Boise City ID State 185787 203529 -116.237655 43.613736\n","3 4 Olympia WA State 27514 45523 -122.893073 47.042418\n","4 5 Salem OR State 136924 152039 -123.029155 44.931109"]},"execution_count":5,"metadata":{},"output_type":"execute_result"}],"source":["# read the initial csv\n","csv1 = 'data/updating_gis_content/capitals_1.csv'\n","cities_df_1 = pd.read_csv(csv1)\n","cities_df_1.head()"]},{"cell_type":"code","execution_count":6,"metadata":{},"outputs":[{"data":{"text/plain":["(19, 8)"]},"execution_count":6,"metadata":{},"output_type":"execute_result"}],"source":["# print the number of records in this csv\n","cities_df_1.shape"]},{"cell_type":"markdown","metadata":{},"source":["As you can see, this dataset only contains 19 rows or 19 capital cities. It is not the complete dataset."]},{"cell_type":"markdown","metadata":{},"source":["Let's add this `csv` as a portal item. Adding the item creates a CSV item and uploads the original file to the portal, establishing a link between the item and the original file name. Therefore, we need a unique name for the file to guarantee it does not collide with any file of the same name that may have been uploaded by the same user. We'll use standard library modules to copy the file and give it a new name so we can add it to the portal"]},{"cell_type":"code","execution_count":7,"metadata":{},"outputs":[],"source":["import os\n","import datetime as dt\n","import shutil\n","\n","# assign variables to locations on the file system \n","cwd = os.path.abspath(os.getcwd())\n","data_pth = os.path.join(cwd, r'data/updating_gis_content/')\n","\n","# create a unique timestamp string to append to the file name\n","now_ts = str(int(dt.datetime.now().timestamp()))\n","\n","# copy the file, appending the unique string and assign it to a variable\n","my_csv = shutil.copyfile(os.path.abspath(csv1),\n"," os.path.join(data_pth, 'capitals_1_' + now_ts + '.csv'))"]},{"cell_type":"code","execution_count":8,"metadata":{},"outputs":[{"data":{"text/plain":["'\\\\path\\\\on\\\\computer\\\\05_content_publishers\\\\data/updating_gis_content/capitals_1_1548875507.csv'"]},"execution_count":8,"metadata":{},"output_type":"execute_result"}],"source":["my_csv"]},{"cell_type":"code","execution_count":9,"metadata":{},"outputs":[{"data":{"text/html":["
\n","
\n"," \n"," \n"," \n","
\n","\n","
\n"," USA Capitals spreadsheet 1548875507\n"," \n","
CSV by arcgis_python\n","
Last Modified: January 30, 2019\n","
0 comments, 0 views\n","
\n","
\n"," "],"text/plain":[""]},"execution_count":9,"metadata":{},"output_type":"execute_result"}],"source":["# add the initial csv file and publish that as a web layer\n","item_prop = {'title':'USA Capitals spreadsheet ' + now_ts}\n","root_folder = gis.content.folders.get()\n","csv_item = root_folder.add(item_properties=item_prop, file=my_csv)\n","csv_item"]},{"cell_type":"markdown","metadata":{},"source":["This spreadsheet has co-ordinates as `latitude` and `longitude` columns which will be used for geometries during publishing."]},{"cell_type":"code","execution_count":10,"metadata":{},"outputs":[{"data":{"text/html":["
\n","
\n"," \n"," \n"," \n","
\n","\n","
\n"," USA Capitals spreadsheet 1548875507\n"," \n","
Feature Layer Collection by arcgis_python\n","
Last Modified: January 30, 2019\n","
0 comments, 0 views\n","
\n","
\n"," "],"text/plain":[""]},"execution_count":10,"metadata":{},"output_type":"execute_result"}],"source":["# publish the csv item into a feature layer\n","cities_item = csv_item.publish()\n","cities_item"]},{"cell_type":"code","execution_count":19,"metadata":{},"outputs":[{"data":{"text/html":["
\n","
\n"," \n"," \n"," \n","
\n","\n","
\n"," USA Capitals\n"," \n","
Feature Layer Collection by arcgis_python\n","
Last Modified: January 30, 2019\n","
0 comments, 0 views\n","
\n","
\n"," "],"text/plain":[""]},"execution_count":19,"metadata":{},"output_type":"execute_result"}],"source":["# update the item metadata\n","item_prop = {'title':'USA Capitals'}\n","cities_item.update(item_properties = item_prop, thumbnail='data/updating_gis_content/capital_cities.png')\n","cities_item"]},{"cell_type":"markdown","metadata":{},"source":["### Apply updates from the second spreadsheet\n","The next set of updates have arrived and are stored in `capitals_2.csv`. We are told it contains corrections for the original set in addition to new features. We need to figure out which rows have changed, apply `update` operation on those, then apply `add` operation to new rows.\n","\n","To start with, let us read the second csv file. Note, in this sample, data is stored in csv. In reality, it could be from your enterprise database or any other data source."]},{"cell_type":"code","execution_count":11,"metadata":{},"outputs":[{"data":{"text/html":["
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
city_idnamestatecapitalpop2000pop2007longitudelatitude
020Baton RougeLAState227818228810-91.14022730.458091
121HelenaMTState2578026007-112.02702746.595809
222BismarckNDState5553259344-100.77900046.813346
323PierreSDState1387614169-100.33638244.367964
424St. PaulMNState287151291643-93.11411844.954364
\n","
"],"text/plain":[" city_id name state capital pop2000 pop2007 longitude latitude\n","0 20 Baton Rouge LA State 227818 228810 -91.140227 30.458091\n","1 21 Helena MT State 25780 26007 -112.027027 46.595809\n","2 22 Bismarck ND State 55532 59344 -100.779000 46.813346\n","3 23 Pierre SD State 13876 14169 -100.336382 44.367964\n","4 24 St. Paul MN State 287151 291643 -93.114118 44.954364"]},"execution_count":11,"metadata":{},"output_type":"execute_result"}],"source":["# read the second csv set\n","csv2 = 'data/updating_gis_content/capitals_2.csv'\n","cities_df_2 = pd.read_csv(csv2)\n","cities_df_2.head()"]},{"cell_type":"code","execution_count":12,"metadata":{},"outputs":[{"data":{"text/plain":["(36, 8)"]},"execution_count":12,"metadata":{},"output_type":"execute_result"}],"source":["# get the dimensions of this csv\n","cities_df_2.shape"]},{"cell_type":"markdown","metadata":{},"source":["#### Identifying existing features that need to be updated\n","\n","To identify features that need to be updated, let us read the attribute table of the published feature layer and compare that against the second csv. To read the attribute table, we perform a `query()` on the feature layer which returns us an `arcgis.feature.FeatureSet` object. Refer to the guide pages on [accessing features from feature layers](https://developers.arcgis.com/python/guide/working-with-feature-layers-and-features/) to learn more about this.\n","\n","Note, at this point, we could work with the `cities_df_1` dataframe we created from the original csv file. However, in practice you may not always have the original dataset or your feature layer might have undergone edits after it was published. Hence, we query the feature layer directly."]},{"cell_type":"code","execution_count":13,"metadata":{},"outputs":[{"data":{"text/html":["
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
SHAPEcapitalcity_idlatitudelongitudenameobjectidpop2000pop2007state
0{\"x\": -17568824.553, \"y\": 2428377.352700006, \"...State121.305782-157.823436Honolulu1371657378587HI
1{\"x\": -14973760.769500002, \"y\": 8041504.674200...State258.351418-134.511582Juneau23071131592AK
2{\"x\": -12939516.521100001, \"y\": 5405860.248099...State343.613736-116.237655Boise City3185787203529ID
3{\"x\": -13680394.263900002, \"y\": 5949000.547900...State447.042418-122.893073Olympia42751445523WA
4{\"x\": -13695542.842799995, \"y\": 5610682.544100...State544.931109-123.029155Salem5136924152039OR
\n","
"],"text/plain":[" SHAPE capital city_id \\\n","0 {\"x\": -17568824.553, \"y\": 2428377.352700006, \"... State 1 \n","1 {\"x\": -14973760.769500002, \"y\": 8041504.674200... State 2 \n","2 {\"x\": -12939516.521100001, \"y\": 5405860.248099... State 3 \n","3 {\"x\": -13680394.263900002, \"y\": 5949000.547900... State 4 \n","4 {\"x\": -13695542.842799995, \"y\": 5610682.544100... State 5 \n","\n"," latitude longitude name objectid pop2000 pop2007 state \n","0 21.305782 -157.823436 Honolulu 1 371657 378587 HI \n","1 58.351418 -134.511582 Juneau 2 30711 31592 AK \n","2 43.613736 -116.237655 Boise City 3 185787 203529 ID \n","3 47.042418 -122.893073 Olympia 4 27514 45523 WA \n","4 44.931109 -123.029155 Salem 5 136924 152039 OR "]},"execution_count":13,"metadata":{},"output_type":"execute_result"}],"source":["cities_flayer = cities_item.layers[0]\n","cities_fset = cities_flayer.query() #querying without any conditions returns all the features\n","cities_fset.sdf.head()"]},{"cell_type":"markdown","metadata":{},"source":["The `city_id` column is common between both the datasets. Next, let us perform an `inner` join with the table from feature layer as left and updated csv as right. Inner joins will yield those rows that are present in both tables. Learn more about [inner joins here](https://www.w3schools.com/sql/sql_join_inner.asp)."]},{"cell_type":"code","execution_count":14,"metadata":{},"outputs":[{"data":{"text/html":["
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
SHAPEcapital_xcity_idlatitude_xlongitude_xname_xobjectidpop2000_xpop2007_xstate_xname_ystate_ycapital_ypop2000_ypop2007_ylongitude_ylatitude_y
0{'x': -13680394.263900002, 'y': 5949000.547900...State447.042418-122.893073Olympia42751445523WAOlympiaWAState4251445523-122.89307347.042418
1{'x': 11666510.350300007, 'y': -5033833.302499...State10-41.145545104.802046Cheyenne105301154750WYCheyenneWYState5301154750-104.80204641.145545
2{'x': -10857545.543799996, 'y': 4229619.674200...State1335.482309-97.534991Oklahoma City13506132552556OKKOklahoma CityOKState506132552556-97.53499135.482309
3{'x': -10421826.864700003, 'y': 5099899.263700...State1641.590936-93.620864Des Moines16200682201257IADes MoinesIAState198682201257-93.62086441.590936
\n","
"],"text/plain":[" SHAPE capital_x city_id \\\n","0 {'x': -13680394.263900002, 'y': 5949000.547900... State 4 \n","1 {'x': 11666510.350300007, 'y': -5033833.302499... State 10 \n","2 {'x': -10857545.543799996, 'y': 4229619.674200... State 13 \n","3 {'x': -10421826.864700003, 'y': 5099899.263700... State 16 \n","\n"," latitude_x longitude_x name_x objectid pop2000_x pop2007_x \\\n","0 47.042418 -122.893073 Olympia 4 27514 45523 \n","1 -41.145545 104.802046 Cheyenne 10 53011 54750 \n","2 35.482309 -97.534991 Oklahoma City 13 506132 552556 \n","3 41.590936 -93.620864 Des Moines 16 200682 201257 \n","\n"," state_x name_y state_y capital_y pop2000_y pop2007_y longitude_y \\\n","0 WA Olympia WA State 42514 45523 -122.893073 \n","1 WY Cheyenne WY State 53011 54750 -104.802046 \n","2 OKK Oklahoma City OK State 506132 552556 -97.534991 \n","3 IA Des Moines IA State 198682 201257 -93.620864 \n","\n"," latitude_y \n","0 47.042418 \n","1 41.145545 \n","2 35.482309 \n","3 41.590936 "]},"execution_count":14,"metadata":{},"output_type":"execute_result"}],"source":["overlap_rows = pd.merge(left = cities_fset.sdf, right = cities_df_2, how='inner',\n"," on = 'city_id')\n","overlap_rows"]},{"cell_type":"markdown","metadata":{},"source":["Thus, of 19 features in original and 36 features in second csv, 4 features are common. Inspecting the table, we find certain columns are updated, for instance, Cheyenne has its coordinates corrected, Oklahoma City has its state abbreviation corrected and similarly other cities have one of their attribute columns updated.\n","\n","We could either update individual attribute values for these 4 features or update all attribute values with the latest csv. Below, we are performing the latter as it is simple and fast.\n","\n","#### Perform updates to the existing features"]},{"cell_type":"code","execution_count":15,"metadata":{},"outputs":[],"source":["features_for_update = [] #list containing corrected features\n","all_features = cities_fset.features"]},{"cell_type":"code","execution_count":16,"metadata":{},"outputs":[{"data":{"text/plain":["{\"geometry\": {\"x\": -17568824.553, \"y\": 2428377.352700006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 1, \"city_id\": 1, \"name\": \"Honolulu\", \"state\": \"HI\", \"capital\": \"State\", \"pop2000\": 371657, \"pop2007\": 378587, \"longitude\": -157.8234362, \"latitude\": 21.30578163, \"SHAPE\": {\"x\": -17568824.553, \"y\": 2428377.352700006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}"]},"execution_count":16,"metadata":{},"output_type":"execute_result"}],"source":["# inspect one of the features\n","all_features[0]"]},{"cell_type":"markdown","metadata":{},"source":["Note the X and Y geometry values are different from decimal degree coordinates present in Longitude and Latitude fields. To perform geometry edits, we need to project the coordinates to match that of the feature layer."]},{"cell_type":"code","execution_count":17,"metadata":{},"outputs":[{"data":{"text/plain":["{'wkid': 102100, 'latestWkid': 3857}"]},"execution_count":17,"metadata":{},"output_type":"execute_result"}],"source":["# get the spatial reference of the features since we need to update the geometry\n","cities_fset.spatial_reference"]},{"cell_type":"markdown","metadata":{},"source":["Below, we prepare updated geometries and attributes for each of the 4 features we determined above. We use the `arcgis.geometry` module to `project` the coordinates from geographic to projected coordinate system. 
The cell below prints the original `Feature` objects followed by the updated ones. If you look closely, you can find the differences."]},{"cell_type":"code","execution_count":18,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["{\"geometry\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 4, \"city_id\": 4, \"name\": \"Olympia\", \"state\": \"WA\", \"capital\": \"State\", \"pop2000\": 27514, \"pop2007\": 45523, \"longitude\": -122.8930726, \"latitude\": 47.04241817, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","{\"geometry\": {\"x\": -13680394.2638528, \"y\": 5949000.54792491}, \"attributes\": {\"objectid\": 4, \"city_id\": 4, \"name\": \"Olympia\", \"state\": \"WA\", \"capital\": \"State\", \"pop2000\": 42514, \"pop2007\": 45523, \"longitude\": -122.8930726, \"latitude\": 47.04241817, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","========================================================================\n","{\"geometry\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 10, \"city_id\": 10, \"name\": \"Cheyenne\", \"state\": \"WY\", \"capital\": \"State\", \"pop2000\": 53011, \"pop2007\": 54750, \"longitude\": 104.8020456, \"latitude\": -41.14554516, \"SHAPE\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","{\"geometry\": {\"x\": -11666510.350285435, \"y\": 5033833.302497153}, \"attributes\": {\"objectid\": 10, \"city_id\": 10, \"name\": \"Cheyenne\", \"state\": \"WY\", \"capital\": \"State\", \"pop2000\": 53011, \"pop2007\": 54750, \"longitude\": -104.80204559999999, \"latitude\": 41.14554516, \"SHAPE\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","========================================================================\n","{\"geometry\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 13, \"city_id\": 13, \"name\": \"Oklahoma City\", \"state\": \"OKK\", \"capital\": \"State\", \"pop2000\": 506132, \"pop2007\": 552556, \"longitude\": -97.5349911, \"latitude\": 35.48230867, \"SHAPE\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","{\"geometry\": {\"x\": -10857545.54377847, \"y\": 4229619.674165817}, \"attributes\": {\"objectid\": 13, \"city_id\": 13, \"name\": \"Oklahoma City\", \"state\": \"OK\", \"capital\": \"State\", \"pop2000\": 506132, \"pop2007\": 552556, \"longitude\": -97.5349911, \"latitude\": 35.48230867, \"SHAPE\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","========================================================================\n","{\"geometry\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 16, \"city_id\": 16, \"name\": \"Des Moines\", \"state\": \"IA\", \"capital\": \"State\", \"pop2000\": 200682, \"pop2007\": 201257, \"longitude\": -93.62086361, 
\"latitude\": 41.59093617, \"SHAPE\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","{\"geometry\": {\"x\": -10421826.864691716, \"y\": 5099899.263692743}, \"attributes\": {\"objectid\": 16, \"city_id\": 16, \"name\": \"Des Moines\", \"state\": \"IA\", \"capital\": \"State\", \"pop2000\": 198682, \"pop2007\": 201257, \"longitude\": -93.62086361, \"latitude\": 41.59093617, \"SHAPE\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","========================================================================\n"]}],"source":["from arcgis import geometry #use geometry module to project Long,Lat to X and Y\n","from copy import deepcopy\n","\n","for city_id in overlap_rows['city_id']:\n"," # get the feature to be updated\n"," original_feature = [f for f in all_features if f.attributes['city_id'] == city_id][0]\n"," feature_to_be_updated = deepcopy(original_feature)\n"," \n"," print(str(original_feature))\n"," \n"," # get the matching row from csv\n"," matching_row = cities_df_2.where(cities_df_2.city_id == city_id).dropna()\n"," \n"," #get geometries in the destination coordinate system\n"," input_geometry = {'y':float(matching_row['latitude']),\n"," 'x':float(matching_row['longitude'])}\n"," output_geometry = geometry.project(geometries = [input_geometry],\n"," in_sr = 4326, \n"," out_sr = cities_fset.spatial_reference['latestWkid'],\n"," gis = gis)\n"," \n"," # assign the updated values\n"," feature_to_be_updated.geometry = output_geometry[0]\n"," feature_to_be_updated.attributes['longitude'] = float(matching_row['longitude'])\n"," feature_to_be_updated.attributes['city_id'] = int(matching_row['city_id'])\n"," feature_to_be_updated.attributes['state'] = matching_row['state'].values[0]\n"," feature_to_be_updated.attributes['capital'] = matching_row['capital'].values[0]\n"," feature_to_be_updated.attributes['latitude'] = float(matching_row['latitude'])\n"," feature_to_be_updated.attributes['name'] = matching_row['name'].values[0]\n"," feature_to_be_updated.attributes['pop2000'] = int(matching_row['pop2000'])\n"," feature_to_be_updated.attributes['pop2007'] = int(matching_row['pop2007'])\n"," \n"," #add this to the list of features to be updated\n"," features_for_update.append(feature_to_be_updated)\n"," \n"," print(str(feature_to_be_updated))\n"," print(\"========================================================================\")"]},{"cell_type":"markdown","metadata":{},"source":["We have constructed a list of features with updated values. 
We can use this list to perform updates on the feature layer."]},{"cell_type":"code","execution_count":19,"metadata":{},"outputs":[{"data":{"text/plain":["[{\"geometry\": {\"x\": -13680394.2638528, \"y\": 5949000.54792491}, \"attributes\": {\"objectid\": 4, \"city_id\": 4, \"name\": \"Olympia\", \"state\": \"WA\", \"capital\": \"State\", \"pop2000\": 42514, \"pop2007\": 45523, \"longitude\": -122.8930726, \"latitude\": 47.04241817, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}},\n"," {\"geometry\": {\"x\": -11666510.350285435, \"y\": 5033833.302497153}, \"attributes\": {\"objectid\": 10, \"city_id\": 10, \"name\": \"Cheyenne\", \"state\": \"WY\", \"capital\": \"State\", \"pop2000\": 53011, \"pop2007\": 54750, \"longitude\": -104.80204559999999, \"latitude\": 41.14554516, \"SHAPE\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}},\n"," {\"geometry\": {\"x\": -10857545.54377847, \"y\": 4229619.674165817}, \"attributes\": {\"objectid\": 13, \"city_id\": 13, \"name\": \"Oklahoma City\", \"state\": \"OK\", \"capital\": \"State\", \"pop2000\": 506132, \"pop2007\": 552556, \"longitude\": -97.5349911, \"latitude\": 35.48230867, \"SHAPE\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}},\n"," {\"geometry\": {\"x\": -10421826.864691716, \"y\": 5099899.263692743}, \"attributes\": {\"objectid\": 16, \"city_id\": 16, \"name\": \"Des Moines\", \"state\": \"IA\", \"capital\": \"State\", \"pop2000\": 198682, \"pop2007\": 201257, \"longitude\": -93.62086361, \"latitude\": 41.59093617, \"SHAPE\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}]"]},"execution_count":19,"metadata":{},"output_type":"execute_result"}],"source":["features_for_update"]},{"cell_type":"markdown","metadata":{},"source":["To update the feature layer, call the `edit_features()` method of the `FeatureLayer` object and pass the list of features to the `updates` parameter:"]},{"cell_type":"code","execution_count":20,"metadata":{},"outputs":[{"data":{"text/plain":["{'addResults': [],\n"," 'updateResults': [{'objectId': 4, 'globalId': None, 'success': True},\n"," {'objectId': 10, 'globalId': None, 'success': True},\n"," {'objectId': 13, 'globalId': None, 'success': True},\n"," {'objectId': 16, 'globalId': None, 'success': True}],\n"," 'deleteResults': [],\n"," 'attachments': {'addResults': [], 'updateResults': [], 'deleteResults': []}}"]},"execution_count":20,"metadata":{},"output_type":"execute_result"}],"source":["cities_flayer.edit_features(updates= features_for_update)"]},{"cell_type":"markdown","metadata":{},"source":["We have successfully applied corrections to those features which existed in the feature layer from the initial dataset. Next let us proceed to adding new features present only in the second csv file.\n","\n","#### Identifying new features that need to be added"]},{"cell_type":"code","execution_count":21,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["(32, 8)\n"]}],"source":["#select those rows in the capitals_2.csv that do not overlap with those in capitals_1.csv\n","new_rows = cities_df_2[~cities_df_2['city_id'].isin(overlap_rows['city_id'])]\n","print(new_rows.shape)"]},{"cell_type":"code","execution_count":22,"metadata":{},"outputs":[{"data":{"text/html":["
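As an aside, `edit_features()` reports per-feature success, so a defensive workflow might inspect the response rather than assume every edit applied. A minimal sketch, with `result` and `failed` as hypothetical names introduced only for illustration:

```python
# sketch: capture the response and surface any failed updates
result = cities_flayer.edit_features(updates=features_for_update)
failed = [r for r in result['updateResults'] if not r['success']]
if failed:
    print('Some updates failed:', failed)
```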
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
city_idnamestatecapitalpop2000pop2007longitudelatitude
020Baton RougeLAState227818228810-91.14022730.458091
121HelenaMTState2578026007-112.02702746.595809
222BismarckNDState5553259344-100.77900046.813346
323PierreSDState1387614169-100.33638244.367964
424St. PaulMNState287151291643-93.11411844.954364
\n","
"],"text/plain":[" city_id name state capital pop2000 pop2007 longitude latitude\n","0 20 Baton Rouge LA State 227818 228810 -91.140227 30.458091\n","1 21 Helena MT State 25780 26007 -112.027027 46.595809\n","2 22 Bismarck ND State 55532 59344 -100.779000 46.813346\n","3 23 Pierre SD State 13876 14169 -100.336382 44.367964\n","4 24 St. Paul MN State 287151 291643 -93.114118 44.954364"]},"execution_count":22,"metadata":{},"output_type":"execute_result"}],"source":["new_rows.head()"]},{"cell_type":"markdown","metadata":{},"source":["Thus, of the total 36 rows in the second csv, we have determined the 32 other rows which are new and need to be appended as new features."]},{"cell_type":"markdown","metadata":{},"source":["#### Adding new features\n","Next, let us compose another `list` of `Feature` objects similar to earlier, from the `new_rows` data frame."]},{"cell_type":"code","execution_count":23,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["Creating Baton Rouge\n","Creating Helena\n","Creating Bismarck\n","Creating Pierre\n","Creating St. Paul\n","Creating Madison\n","Creating Lansing\n","Creating Augusta\n","Creating Montpelier\n","Creating Albany\n","Creating Boston\n","Creating Concord\n","Creating Tallahassee\n","Creating Jackson\n","Creating Nashville\n","Creating Montgomery\n","Creating Springfield\n","Creating Indianapolis\n","Creating Frankfort\n","Creating Columbus\n","Creating Charleston\n","Creating Atlanta\n","Creating Columbia\n","Creating Raleigh\n","Creating Richmond\n","Creating Harrisburg\n","Creating Trenton\n","Creating Dover\n","Creating Washington\n","Creating Annapolis\n","Creating Hartford\n","Creating Providence\n"]}],"source":["features_to_be_added = []\n","\n","# get a template feature object\n","template_feature = deepcopy(features_for_update[0])\n","\n","# loop through each row and add to the list of features to be added\n","for row in new_rows.iterrows():\n"," new_feature = deepcopy(template_feature)\n"," \n"," #print\n"," print(\"Creating \" + row[1]['name'])\n"," \n"," #get geometries in the destination coordinate system\n"," input_geometry = {'y':float(row[1]['latitude']),\n"," 'x':float(row[1]['longitude'])}\n"," output_geometry = geometry.project(geometries = [input_geometry],\n"," in_sr = 4326, \n"," out_sr = cities_fset.spatial_reference['latestWkid'],\n"," gis = gis)\n"," \n"," # assign the updated values\n"," new_feature.geometry = output_geometry[0]\n"," new_feature.attributes['longitude'] = float(row[1]['longitude'])\n"," new_feature.attributes['city_id'] = int(row[1]['city_id'])\n"," new_feature.attributes['state'] = row[1]['state']\n"," new_feature.attributes['capital'] = row[1]['capital']\n"," new_feature.attributes['latitude'] = float(row[1]['latitude'])\n"," new_feature.attributes['name'] = row[1]['name']\n"," new_feature.attributes['pop2000'] = int(row[1]['pop2000'])\n"," new_feature.attributes['pop2007'] = int(row[1]['pop2007'])\n"," \n"," #add this to the list of features to be updated\n"," features_to_be_added.append(new_feature)"]},{"cell_type":"code","execution_count":24,"metadata":{},"outputs":[{"data":{"text/plain":["{\"geometry\": {\"x\": -10145683.671555312, \"y\": 3562570.141302621}, \"attributes\": {\"objectid\": 4, \"city_id\": 20, \"name\": \"Baton Rouge\", \"state\": \"LA\", \"capital\": \"State\", \"pop2000\": 227818, \"pop2007\": 228810, \"longitude\": -91.14022709999999, \"latitude\": 30.45809113, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 
102100, \"latestWkid\": 3857}}}}"]},"execution_count":24,"metadata":{},"output_type":"execute_result"}],"source":["# take a look at one of the features we created\n","features_to_be_added[0]"]},{"cell_type":"markdown","metadata":{},"source":["Thus, we have created a `list` of `Feature` objects with appropriate attributes and geometries. Next, to add these new features to the feature layer, call the `edit_features()` method of the `FeatureLayer` object and pass the list of `Feature` objects to the `adds` parameter:"]},{"cell_type":"code","execution_count":25,"metadata":{},"outputs":[{"data":{"text/plain":["{'addResults': [{'objectId': 20, 'globalId': None, 'success': True},\n"," {'objectId': 21, 'globalId': None, 'success': True},\n"," {'objectId': 22, 'globalId': None, 'success': True},\n"," {'objectId': 23, 'globalId': None, 'success': True},\n"," {'objectId': 24, 'globalId': None, 'success': True},\n"," {'objectId': 25, 'globalId': None, 'success': True},\n"," {'objectId': 26, 'globalId': None, 'success': True},\n"," {'objectId': 27, 'globalId': None, 'success': True},\n"," {'objectId': 28, 'globalId': None, 'success': True},\n"," {'objectId': 29, 'globalId': None, 'success': True},\n"," {'objectId': 30, 'globalId': None, 'success': True},\n"," {'objectId': 31, 'globalId': None, 'success': True},\n"," {'objectId': 32, 'globalId': None, 'success': True},\n"," {'objectId': 33, 'globalId': None, 'success': True},\n"," {'objectId': 34, 'globalId': None, 'success': True},\n"," {'objectId': 35, 'globalId': None, 'success': True},\n"," {'objectId': 36, 'globalId': None, 'success': True},\n"," {'objectId': 37, 'globalId': None, 'success': True},\n"," {'objectId': 38, 'globalId': None, 'success': True},\n"," {'objectId': 39, 'globalId': None, 'success': True},\n"," {'objectId': 40, 'globalId': None, 'success': True},\n"," {'objectId': 41, 'globalId': None, 'success': True},\n"," {'objectId': 42, 'globalId': None, 'success': True},\n"," {'objectId': 43, 'globalId': None, 'success': True},\n"," {'objectId': 44, 'globalId': None, 'success': True},\n"," {'objectId': 45, 'globalId': None, 'success': True},\n"," {'objectId': 46, 'globalId': None, 'success': True},\n"," {'objectId': 47, 'globalId': None, 'success': True},\n"," {'objectId': 48, 'globalId': None, 'success': True},\n"," {'objectId': 49, 'globalId': None, 'success': True},\n"," {'objectId': 50, 'globalId': None, 'success': True},\n"," {'objectId': 51, 'globalId': None, 'success': True}],\n"," 'updateResults': [],\n"," 'deleteResults': [],\n"," 'attachments': {'addResults': [], 'updateResults': [], 'deleteResults': []}}"]},"execution_count":25,"metadata":{},"output_type":"execute_result"}],"source":["cities_flayer.edit_features(adds = features_to_be_added)"]},{"cell_type":"markdown","metadata":{},"source":["Thus, we have successfully applied edits from second csv file. Next let us look at how we can apply edits from third csv file.\n","\n","### Apply edits from third spreadsheet\n","The next set of updates have arrived and are stored in `capitals_annex.csv`. We are told it contains additional columns for each of the features that we want to add to the feature layer.\n","\n","To start with, let us read the third csv file. Note in this sample, data is stored in csv. In reality, it could be from your enterprise database or any other data source."]},{"cell_type":"code","execution_count":26,"metadata":{},"outputs":[{"data":{"text/html":["
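For datasets much larger than this sample's 32 new rows, the per-row `project()` calls in the loop above can add up. A minimal sketch (not part of the original sample) that projects all coordinates in a single call and applies the adds in batches; `batch_size` is a hypothetical tuning knob:

```python
# sketch: one project() call for all new rows, then batched adds
coords = [{'x': float(row['longitude']), 'y': float(row['latitude'])}
          for _, row in new_rows.iterrows()]
projected = geometry.project(geometries=coords, in_sr=4326,
                             out_sr=cities_fset.spatial_reference['latestWkid'],
                             gis=gis)

batch_size = 250  # hypothetical; tune to your service's limits
for i in range(0, len(features_to_be_added), batch_size):
    cities_flayer.edit_features(adds=features_to_be_added[i:i + batch_size])
```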
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
city_idnameclasswhiteblackameri_esasianhawn_plhispanicmalesfemales
01HonoluluCensus Designated Place7309360386892075882545716229182628189029
12JuneauCity and Borough229692483496143811610401546915242
23Boise CityCity17120414371300387030284109201493773
34OlympiaCity36246805553247312518632031922195
45SalemCity113746175020643304643199736875268172
\n","
"],"text/plain":[" city_id name class white black ameri_es \\\n","0 1 Honolulu Census Designated Place 73093 6038 689 \n","1 2 Juneau City and Borough 22969 248 3496 \n","2 3 Boise City City 171204 1437 1300 \n","3 4 Olympia City 36246 805 553 \n","4 5 Salem City 113746 1750 2064 \n","\n"," asian hawn_pl hispanic males females \n","0 207588 25457 16229 182628 189029 \n","1 1438 116 1040 15469 15242 \n","2 3870 302 8410 92014 93773 \n","3 2473 125 1863 20319 22195 \n","4 3304 643 19973 68752 68172 "]},"execution_count":26,"metadata":{},"output_type":"execute_result"}],"source":["# read the third csv set\n","csv3 = 'data/updating_gis_content/capitals_annex.csv'\n","cities_df_3 = pd.read_csv(csv3)\n","cities_df_3.head()"]},{"cell_type":"code","execution_count":27,"metadata":{},"outputs":[{"data":{"text/plain":["(51, 11)"]},"execution_count":27,"metadata":{},"output_type":"execute_result"}],"source":["#find the number of rows in the third csv\n","cities_df_3.shape"]},{"cell_type":"markdown","metadata":{},"source":["The `capitals_annex.csv` does not add new features, instead it adds additional attribute columns to existing features. It has 51 rows which were found to match the 19 + 32 rows from first and second csv files. The columns `City_ID` and `NAME` are common to all 3 spreadsheets. Next let us take a look at how we can append this additional attribute information to our feature layer.\n","\n","#### Inspecting existing fields of the feature layer\n","The `manager` property of the `FeatureLayer` object exposes a set of methods to read and update the properties and definition of feature layers."]},{"cell_type":"code","execution_count":28,"metadata":{},"outputs":[{"data":{"text/plain":["{\n"," \"name\": \"city_id\",\n"," \"type\": \"esriFieldTypeInteger\",\n"," \"alias\": \"city_id\",\n"," \"domain\": null,\n"," \"editable\": true,\n"," \"nullable\": true,\n"," \"sqlType\": \"sqlTypeInteger\",\n"," \"length\": 10\n","}"]},"execution_count":28,"metadata":{},"output_type":"execute_result"}],"source":["#Get the existing list of fields on the cities feature layer\n","cities_fields = cities_flayer.manager.properties.fields\n","\n","# Your feature layer may have multiple fields, \n","# instead of printing all, let us take a look at one of the fields:\n","cities_fields[1]"]},{"cell_type":"markdown","metadata":{},"source":["From above, we can see the representation of one of the fields. Let us loop through each of the fields and print the `name`, `alias`, `type` and `sqlType` properties"]},{"cell_type":"code","execution_count":29,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["objectid | objectid | esriFieldTypeOID | sqlTypeInteger\n","city_id | city_id | esriFieldTypeInteger | sqlTypeInteger\n","name | name | esriFieldTypeString | sqlTypeVarchar\n","state | state | esriFieldTypeString | sqlTypeVarchar\n","capital | capital | esriFieldTypeString | sqlTypeVarchar\n","pop2000 | pop2000 | esriFieldTypeInteger | sqlTypeInteger\n","pop2007 | pop2007 | esriFieldTypeInteger | sqlTypeInteger\n","longitude | longitude | esriFieldTypeDouble | sqlTypeNumeric\n","latitude | latitude | esriFieldTypeDouble | sqlTypeNumeric\n"]}],"source":["for field in cities_fields:\n"," print(f\"{field.name:13}| {field.alias:13}| {field.type:25}| {field.sqlType}\")"]},{"cell_type":"markdown","metadata":{},"source":["#### Preparing additional columns to add to the feature layer\n","Now that we have an idea of how the fields are defined, we can go ahead and append new fields to the layer's definition. 
Once we compose a list of new fields, we can push those changes to the feature layer by calling the `add_to_definition()` method. Once the feature layer's definition is updated with new fields, we can loop through each feature and add the appropriate attribute values.\n","\n","To compose the list of new fields to be added, we start by making a copy of one of the existing fields as a template and editing it. Conveniently, in this example all the new fields except one share the same data type: integer. With your data, this may not be the case. In such instances, you can add each field individually."]},{"cell_type":"code","execution_count":30,"metadata":{},"outputs":[{"data":{"text/plain":["{'name': 'city_id',\n"," 'type': 'esriFieldTypeInteger',\n"," 'alias': 'city_id',\n"," 'domain': None,\n"," 'editable': True,\n"," 'nullable': True,\n"," 'sqlType': 'sqlTypeInteger',\n"," 'length': 10}"]},"execution_count":30,"metadata":{},"output_type":"execute_result"}],"source":["# get a template field\n","template_field = dict(deepcopy(cities_fields[1]))\n","template_field"]},{"cell_type":"markdown","metadata":{},"source":["Let us use pandas to get the list of fields that are **new** in spreadsheet 3:"]},{"cell_type":"code","execution_count":31,"metadata":{},"outputs":[{"data":{"text/plain":["['ameri_es',\n"," 'asian',\n"," 'black',\n"," 'class',\n"," 'females',\n"," 'hawn_pl',\n"," 'hispanic',\n"," 'males',\n"," 'white']"]},"execution_count":31,"metadata":{},"output_type":"execute_result"}],"source":["# get the list of new fields from the third spreadsheet that are not in spreadsheets 1 and 2\n","new_field_names = list(cities_df_3.columns.difference(cities_df_1.columns))\n","new_field_names"]},{"cell_type":"markdown","metadata":{},"source":["Now loop through each new field name and create a field dictionary using the template we created earlier. 
Except the field titled `class` all other fields are of type `integer`."]},{"cell_type":"code","execution_count":32,"metadata":{},"outputs":[{"data":{"text/plain":["9"]},"execution_count":32,"metadata":{},"output_type":"execute_result"}],"source":["fields_to_be_added = []\n","for new_field_name in new_field_names:\n"," current_field = deepcopy(template_field)\n"," if new_field_name.lower() == 'class':\n"," current_field['sqlType'] = 'sqlTypeVarchar'\n"," current_field['type'] = 'esriFieldTypeString'\n"," current_field['length'] = 8000\n"," current_field['name'] = new_field_name.lower()\n"," current_field['alias'] = new_field_name\n"," fields_to_be_added.append(current_field)\n"," \n","len(fields_to_be_added)"]},{"cell_type":"code","execution_count":33,"metadata":{},"outputs":[{"data":{"text/plain":["{'name': 'class',\n"," 'type': 'esriFieldTypeString',\n"," 'alias': 'class',\n"," 'domain': None,\n"," 'editable': True,\n"," 'nullable': True,\n"," 'sqlType': 'sqlTypeVarchar',\n"," 'length': 8000}"]},"execution_count":33,"metadata":{},"output_type":"execute_result"}],"source":["#inspect one of the fields\n","fields_to_be_added[3]"]},{"cell_type":"markdown","metadata":{},"source":["#### Adding additional fields to the feature layer\n","The list of new fields we composed can be pushed to the server by calling `add_to_definition()` method on the `manager` property."]},{"cell_type":"code","execution_count":34,"metadata":{},"outputs":[{"data":{"text/plain":["{'success': True}"]},"execution_count":34,"metadata":{},"output_type":"execute_result"}],"source":["cities_flayer.manager.add_to_definition({'fields':fields_to_be_added})"]},{"cell_type":"markdown","metadata":{},"source":["Thus, we have successfully added new fields to our feature layer. Let us verify the new columns show up:"]},{"cell_type":"code","execution_count":35,"metadata":{},"outputs":[{"data":{"text/plain":["18"]},"execution_count":35,"metadata":{},"output_type":"execute_result"}],"source":["new_cities_fields = cities_flayer.manager.properties.fields\n","len(new_cities_fields)"]},{"cell_type":"code","execution_count":36,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["objectid | esriFieldTypeOID\n","city_id | esriFieldTypeInteger\n","name | esriFieldTypeString\n","state | esriFieldTypeString\n","capital | esriFieldTypeString\n","pop2000 | esriFieldTypeInteger\n","pop2007 | esriFieldTypeInteger\n","longitude | esriFieldTypeDouble\n","latitude | esriFieldTypeDouble\n","ameri_es | esriFieldTypeInteger\n","asian | esriFieldTypeInteger\n","black | esriFieldTypeInteger\n","class | esriFieldTypeString\n","females | esriFieldTypeInteger\n","hawn_pl | esriFieldTypeInteger\n","hispanic | esriFieldTypeInteger\n","males | esriFieldTypeInteger\n","white | esriFieldTypeInteger\n"]}],"source":["for field in new_cities_fields:\n"," print(f\"{field.name:10}| {field.type}\")"]},{"cell_type":"markdown","metadata":{},"source":["#### Adding attribute values to the new columns\n","Next we can loop through each row in the third csv and add the new attribute values for these newly created columns."]},{"cell_type":"code","execution_count":37,"metadata":{},"outputs":[],"source":["# Run a fresh query on the feature layer so it includes the new features from\n","# csv2 and new columns from csv3\n","cities_fset2 = cities_flayer.query()\n","cities_features2 = cities_fset2.features"]},{"cell_type":"markdown","metadata":{},"source":["Loop through each row in the third spreadsheet, find the corresponding feature by matching the `city_id` 
value and apply the attribute values for the new fields."]},{"cell_type":"code","execution_count":38,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["1 Adding additional attributes for: Honolulu\n","2 Adding additional attributes for: Juneau\n","3 Adding additional attributes for: Boise City\n","4 Adding additional attributes for: Olympia\n","5 Adding additional attributes for: Salem\n","6 Adding additional attributes for: Carson\n","7 Adding additional attributes for: Sacramento\n","8 Adding additional attributes for: Phoenix\n","9 Adding additional attributes for: Salt Lake City\n","10 Adding additional attributes for: Cheyenne\n","11 Adding additional attributes for: Denver\n","12 Adding additional attributes for: Santa Fe\n","13 Adding additional attributes for: Oklahoma City\n","14 Adding additional attributes for: Topeka\n","15 Adding additional attributes for: Lincoln\n","16 Adding additional attributes for: Des Moines\n","17 Adding additional attributes for: Jefferson City\n","18 Adding additional attributes for: Little Rock\n","19 Adding additional attributes for: Austin\n","20 Adding additional attributes for: Baton Rouge\n","21 Adding additional attributes for: Helena\n","22 Adding additional attributes for: Bismarck\n","23 Adding additional attributes for: Pierre\n","24 Adding additional attributes for: St. Paul\n","25 Adding additional attributes for: Madison\n","26 Adding additional attributes for: Lansing\n","27 Adding additional attributes for: Augusta\n","28 Adding additional attributes for: Montpelier\n","29 Adding additional attributes for: Albany\n","30 Adding additional attributes for: Boston\n","31 Adding additional attributes for: Concord\n","32 Adding additional attributes for: Tallahassee\n","33 Adding additional attributes for: Jackson\n","34 Adding additional attributes for: Nashville\n","35 Adding additional attributes for: Montgomery\n","36 Adding additional attributes for: Springfield\n","37 Adding additional attributes for: Indianapolis\n","38 Adding additional attributes for: Frankfort\n","39 Adding additional attributes for: Columbus\n","40 Adding additional attributes for: Charleston\n","41 Adding additional attributes for: Atlanta\n","42 Adding additional attributes for: Columbia\n","43 Adding additional attributes for: Raleigh\n","44 Adding additional attributes for: Richmond\n","45 Adding additional attributes for: Harrisburg\n","46 Adding additional attributes for: Trenton\n","47 Adding additional attributes for: Dover\n","48 Adding additional attributes for: Washington\n","49 Adding additional attributes for: Annapolis\n","50 Adding additional attributes for: Hartford\n","51 Adding additional attributes for: Providence\n"]}],"source":["features_for_update = []\n","for city_id in cities_df_3['city_id']:\n"," # get the matching row from csv\n"," matching_row = cities_df_3.where(cities_df_3.city_id == city_id).dropna()\n"," \n"," print(str(city_id) + \" Adding additional attributes for: \" + matching_row['name'].values[0])\n"," \n"," # get the feature to be updated\n"," original_feature = [f for f in cities_features2 if f.attributes['city_id'] == city_id][0]\n"," feature_to_be_updated = deepcopy(original_feature)\n"," \n"," # assign the updated values\n"," feature_to_be_updated.attributes['class'] = matching_row['class'].values[0]\n"," feature_to_be_updated.attributes['white'] = int(matching_row['white'])\n"," feature_to_be_updated.attributes['black'] = int(matching_row['black'])\n"," feature_to_be_updated.attributes['ameri_es'] 
= int(matching_row['ameri_es'])\n"," feature_to_be_updated.attributes['asian'] = int(matching_row['asian'])\n"," feature_to_be_updated.attributes['hawn_pl'] = int(matching_row['hawn_pl'])\n"," feature_to_be_updated.attributes['hispanic'] = int(matching_row['hispanic'])\n"," feature_to_be_updated.attributes['males'] = int(matching_row['males'])\n"," feature_to_be_updated.attributes['females'] = int(matching_row['females'])\n"," \n"," #add this to the list of features to be updated\n"," features_for_update.append(feature_to_be_updated)"]},{"cell_type":"code","execution_count":39,"metadata":{},"outputs":[{"data":{"text/plain":["{\"geometry\": {\"x\": -7950674.8190312665, \"y\": 5134585.3226086125}, \"attributes\": {\"objectid\": 51, \"city_id\": 51, \"name\": \"Providence\", \"state\": \"RI\", \"capital\": \"State\", \"pop2000\": 173618, \"pop2007\": 183731, \"longitude\": -71.42212709, \"latitude\": 41.82355466, \"ameri_es\": 1975, \"asian\": 10432, \"black\": 25243, \"class\": \"City\", \"females\": 90583, \"hawn_pl\": 270, \"hispanic\": 52146, \"males\": 83035, \"white\": 94666}}"]},"execution_count":39,"metadata":{},"output_type":"execute_result"}],"source":["# inspect one of the features\n","features_for_update[-1]"]},{"cell_type":"code","execution_count":40,"metadata":{},"outputs":[{"data":{"text/plain":["{'addResults': [],\n"," 'updateResults': [{'objectId': 1, 'globalId': None, 'success': True},\n"," {'objectId': 2, 'globalId': None, 'success': True},\n"," {'objectId': 3, 'globalId': None, 'success': True},\n"," {'objectId': 4, 'globalId': None, 'success': True},\n"," {'objectId': 5, 'globalId': None, 'success': True},\n"," {'objectId': 6, 'globalId': None, 'success': True},\n"," {'objectId': 7, 'globalId': None, 'success': True},\n"," {'objectId': 8, 'globalId': None, 'success': True},\n"," {'objectId': 9, 'globalId': None, 'success': True},\n"," {'objectId': 10, 'globalId': None, 'success': True},\n"," {'objectId': 11, 'globalId': None, 'success': True},\n"," {'objectId': 12, 'globalId': None, 'success': True},\n"," {'objectId': 13, 'globalId': None, 'success': True},\n"," {'objectId': 14, 'globalId': None, 'success': True},\n"," {'objectId': 15, 'globalId': None, 'success': True},\n"," {'objectId': 16, 'globalId': None, 'success': True},\n"," {'objectId': 17, 'globalId': None, 'success': True},\n"," {'objectId': 18, 'globalId': None, 'success': True},\n"," {'objectId': 19, 'globalId': None, 'success': True},\n"," {'objectId': 20, 'globalId': None, 'success': True},\n"," {'objectId': 21, 'globalId': None, 'success': True},\n"," {'objectId': 22, 'globalId': None, 'success': True},\n"," {'objectId': 23, 'globalId': None, 'success': True},\n"," {'objectId': 24, 'globalId': None, 'success': True},\n"," {'objectId': 25, 'globalId': None, 'success': True},\n"," {'objectId': 26, 'globalId': None, 'success': True},\n"," {'objectId': 27, 'globalId': None, 'success': True},\n"," {'objectId': 28, 'globalId': None, 'success': True},\n"," {'objectId': 29, 'globalId': None, 'success': True},\n"," {'objectId': 30, 'globalId': None, 'success': True},\n"," {'objectId': 31, 'globalId': None, 'success': True},\n"," {'objectId': 32, 'globalId': None, 'success': True},\n"," {'objectId': 33, 'globalId': None, 'success': True},\n"," {'objectId': 34, 'globalId': None, 'success': True},\n"," {'objectId': 35, 'globalId': None, 'success': True},\n"," {'objectId': 36, 'globalId': None, 'success': True},\n"," {'objectId': 37, 'globalId': None, 'success': True},\n"," {'objectId': 38, 'globalId': None, 'success': 
True},\n"," {'objectId': 39, 'globalId': None, 'success': True},\n"," {'objectId': 40, 'globalId': None, 'success': True},\n"," {'objectId': 41, 'globalId': None, 'success': True},\n"," {'objectId': 42, 'globalId': None, 'success': True},\n"," {'objectId': 43, 'globalId': None, 'success': True},\n"," {'objectId': 44, 'globalId': None, 'success': True},\n"," {'objectId': 45, 'globalId': None, 'success': True},\n"," {'objectId': 46, 'globalId': None, 'success': True},\n"," {'objectId': 47, 'globalId': None, 'success': True},\n"," {'objectId': 48, 'globalId': None, 'success': True},\n"," {'objectId': 49, 'globalId': None, 'success': True},\n"," {'objectId': 50, 'globalId': None, 'success': True},\n"," {'objectId': 51, 'globalId': None, 'success': True}],\n"," 'deleteResults': [],\n"," 'attachments': {'addResults': [], 'updateResults': [], 'deleteResults': []}}"]},"execution_count":40,"metadata":{},"output_type":"execute_result"}],"source":["# apply the edits to the feature layer\n","cities_flayer.edit_features(updates= features_for_update)"]},{"cell_type":"markdown","metadata":{},"source":["#### Verify the changes made so far\n","Let us run another query on the feature layer and visualize a few rows."]},{"cell_type":"code","execution_count":41,"metadata":{},"outputs":[{"data":{"text/html":["
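As a quick spot check (a sketch, not in the original sample), a filtered query can confirm a single correction, such as Olympia's updated `pop2000` value, before pulling every feature; the `where` clause and `out_fields` below are illustrative:

```python
# sketch: targeted query to verify one corrected feature
cities_flayer.query(where="name = 'Olympia'",
                    out_fields='name,pop2000,class').sdf
```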
[HTML table: cities_fset3.sdf.head(5); see the text/plain rendering below]
"],"text/plain":["                                               SHAPE  ameri_es   asian  black  \\\n","0  {\"x\": -17568824.553, \"y\": 2428377.352700006, \"...       689  207588   6038   \n","1  {\"x\": -14973760.769500002, \"y\": 8041504.674200...      3496    1438    248   \n","2  {\"x\": -12939516.521100001, \"y\": 5405860.248099...      1300    3870   1437   \n","3  {\"x\": -13680394.263852797, \"y\": 5949000.547924...       553    2473    805   \n","4  {\"x\": -13695542.842799995, \"y\": 5610682.544100...      2064    3304   1750   \n","\n","  capital  city_id                    class  females  hawn_pl  hispanic  \\\n","0   State        1  Census Designated Place   189029    25457     16229   \n","1   State        2         City and Borough    15242      116      1040   \n","2   State        3                     City    93773      302      8410   \n","3   State        4                     City    22195      125      1863   \n","4   State        5                     City    68172      643     19973   \n","\n","    latitude   longitude   males        name  objectid  pop2000  pop2007  \\\n","0  21.305782 -157.823436  182628    Honolulu         1   371657   378587   \n","1  58.351418 -134.511582   15469      Juneau         2    30711    31592   \n","2  43.613736 -116.237655   92014  Boise City         3   185787   203529   \n","3  47.042418 -122.893073   20319     Olympia         4    42514    45523   \n","4  44.931109 -123.029155   68752       Salem         5   136924   152039   \n","\n","  state   white  \n","0    HI   73093  \n","1    AK   22969  \n","2    ID  171204  \n","3    WA   36246  \n","4    OR  113746  "]},"execution_count":41,"metadata":{},"output_type":"execute_result"}],"source":["cities_fset3 = cities_flayer.query()\n","cities_fset3.sdf.head(5)"]},{"cell_type":"markdown","metadata":{},"source":["## Conclusion\n","In this sample, we observed an edit-intensive method to keep feature layers updated. We published data from the first spreadsheet as a feature layer. We then updated existing features from the second spreadsheet (using the geometry module to project the coordinates in the process), and added new features. The third spreadsheet presented additional attribute columns, which were added to the feature layer by editing its definition and then updating the features with this additional data.\n","\n","This method is edit intensive, and you may choose it when the number of features to edit is small or when you need to selectively update certain features as updates come in.\n","\n","An alternate method is to overwrite the feature layer altogether when you always have current information coming in. 
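For a sense of what that looks like, here is a minimal sketch of the overwrite workflow (assumptions: the refreshed file keeps the same schema and file name as the originally published CSV, and `cities_item` is the item from this sample; the path shown is hypothetical):

```python
from arcgis.features import FeatureLayerCollection

# build a FeatureLayerCollection around the hosted feature layer item
cities_flc = FeatureLayerCollection.fromitem(cities_item)

# replace every feature in the hosted layer with the contents of the refreshed file;
# overwrite() generally expects the same file name and schema as the original upload
cities_flc.manager.overwrite('data/updating_gis_content/capitals_refreshed.csv')  # hypothetical path
```

Because `overwrite()` replaces the layer wholesale, it suits deliveries that are complete snapshots rather than incremental corrections. 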
This method is explained in the sample [Overwriting feature layers](https://developers.arcgis.com/python/sample-notebooks/overwriting-feature-layers)"]}],"metadata":{"anaconda-cloud":{},"esriNotebookRuntime":{"notebookRuntimeName":"ArcGIS Notebook Python 3 Standard","notebookRuntimeVersion":"4.0"},"kernelspec":{"display_name":"Python 3","language":"python","name":"python3"},"language_info":{"codemirror_mode":{"name":"ipython","version":3},"file_extension":".py","mimetype":"text/x-python","name":"python","nbconvert_exporter":"python","pygments_lexer":"ipython3","version":"3.8.2"},"toc":{"base_numbering":1,"nav_menu":{},"number_sections":true,"sideBar":true,"skip_h1_title":false,"title_cell":"Table of Contents","title_sidebar":"Contents","toc_cell":true,"toc_position":{},"toc_section_display":true,"toc_window_display":true}},"nbformat":4,"nbformat_minor":1} From 5d4831f9943fa05a3dade563b01c0d8efe5480e6 Mon Sep 17 00:00:00 2001 From: NANAE AUBRY Date: Thu, 24 Oct 2024 16:17:14 +0200 Subject: [PATCH 2/3] updates --- .../updating_features_in_a_feature_layer.ipynb | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/samples/05_content_publishers/updating_features_in_a_feature_layer.ipynb b/samples/05_content_publishers/updating_features_in_a_feature_layer.ipynb index 4c38e0617a..de21deee9d 100644 --- a/samples/05_content_publishers/updating_features_in_a_feature_layer.ipynb +++ b/samples/05_content_publishers/updating_features_in_a_feature_layer.ipynb @@ -1 +1 @@ -{"cells":[{"cell_type":"markdown","metadata":{},"source":["# Updating features in a feature layer\n","\n","As content publishers, you may be required to keep certain web layers up to date. As new data arrives, you may have to append new features, update existing features etc. There are a couple of different options to accomplish this:\n"," \n"," - Method 1: editing individual features as updated datasets are available\n"," - Method 2: overwriting feature layers altogether with updated datasets\n"," \n","Depending on the number of features that are updated, your workflow requirements, you may adopt either or both kinds of update mechanisms.\n","\n","In this sample, we explore the first method. For **Method 2**, refer to the sample titled [Overwriting feature layers](https://developers.arcgis.com/python/sample-notebooks/overwriting-feature-layers)\n","\n","**Note**: To run this sample, you need the ``pandas`` library in your conda environment. If you don't have the library, install it by running the following command from cmd.exe or your shell\n","```\n","conda install pandas```"]},{"cell_type":"markdown","metadata":{"toc":true},"source":["
Table of Contents
\n",""]},{"cell_type":"code","execution_count":3,"metadata":{},"outputs":[],"source":["# Connect to the GIS\n","from arcgis.gis import GIS\n","from arcgis import features\n","import pandas as pd"]},{"cell_type":"code","execution_count":4,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["Enter password: ········\n"]}],"source":["#Access the portal using \"amazing_arcgis_123\" as password for the given Username. \n","gis = GIS(\"https://pythonapi.playground.esri.com/portal\")"]},{"cell_type":"markdown","metadata":{},"source":["## Updating feature layer by editing individual features\n","\n","Let us consider a scenario where we need to update a feature layer containing the capital cities of the US. We have 3 csv datasets simulating an update workflow as described below:\n","\n"," 1. capitals_1.csv -- contains the initial, incomplete dataset\n"," 2. capitals_2.csv -- contains additional points and updates to existing points, building on top of capitals_1.csv\n"," 3. capitals_annex.csv -- an alternate table containing additional attribute information\n"," \n","Our goal is to update the feature layer with each of these datasets doing the necessary edit operations."]},{"cell_type":"markdown","metadata":{},"source":["### Publish the cities feature layer using the initial dataset"]},{"cell_type":"code","execution_count":5,"metadata":{},"outputs":[{"data":{"text/html":["
[HTML table: cities_df_1.head(); see the text/plain rendering below]
"],"text/plain":[" city_id name state capital pop2000 pop2007 longitude latitude\n","0 1 Honolulu HI State 371657 378587 -157.823436 21.305782\n","1 2 Juneau AK State 30711 31592 -134.511582 58.351418\n","2 3 Boise City ID State 185787 203529 -116.237655 43.613736\n","3 4 Olympia WA State 27514 45523 -122.893073 47.042418\n","4 5 Salem OR State 136924 152039 -123.029155 44.931109"]},"execution_count":5,"metadata":{},"output_type":"execute_result"}],"source":["# read the initial csv\n","csv1 = 'data/updating_gis_content/capitals_1.csv'\n","cities_df_1 = pd.read_csv(csv1)\n","cities_df_1.head()"]},{"cell_type":"code","execution_count":6,"metadata":{},"outputs":[{"data":{"text/plain":["(19, 8)"]},"execution_count":6,"metadata":{},"output_type":"execute_result"}],"source":["# print the number of records in this csv\n","cities_df_1.shape"]},{"cell_type":"markdown","metadata":{},"source":["As you can see, this dataset only contains 19 rows or 19 capital cities. It is not the complete dataset."]},{"cell_type":"markdown","metadata":{},"source":["Let's add this `csv` as a portal item. Adding the item creates a CSV item and uploads the original file to the portal, establishing a link between the item and the original file name. Therefore, we need a unique name for the file to guarantee it does not collide with any file of the same name that may have been uploaded by the same user. We'll use standard library modules to copy the file and give it a new name so we can add it to the portal"]},{"cell_type":"code","execution_count":7,"metadata":{},"outputs":[],"source":["import os\n","import datetime as dt\n","import shutil\n","\n","# assign variables to locations on the file system \n","cwd = os.path.abspath(os.getcwd())\n","data_pth = os.path.join(cwd, r'data/updating_gis_content/')\n","\n","# create a unique timestamp string to append to the file name\n","now_ts = str(int(dt.datetime.now().timestamp()))\n","\n","# copy the file, appending the unique string and assign it to a variable\n","my_csv = shutil.copyfile(os.path.abspath(csv1),\n"," os.path.join(data_pth, 'capitals_1_' + now_ts + '.csv'))"]},{"cell_type":"code","execution_count":8,"metadata":{},"outputs":[{"data":{"text/plain":["'\\\\path\\\\on\\\\computer\\\\05_content_publishers\\\\data/updating_gis_content/capitals_1_1548875507.csv'"]},"execution_count":8,"metadata":{},"output_type":"execute_result"}],"source":["my_csv"]},{"cell_type":"code","execution_count":9,"metadata":{},"outputs":[{"data":{"text/html":["
[item card: USA Capitals spreadsheet 1548875507, CSV by arcgis_python, Last Modified: January 30, 2019, 0 comments, 0 views]
"],"text/plain":["<Item title:\"USA Capitals spreadsheet 1548875507\" type:CSV owner:arcgis_python>"]},"execution_count":9,"metadata":{},"output_type":"execute_result"}],"source":["# add the initial csv file and publish that as a web layer\n","item_prop = {'title':'USA Capitals spreadsheet ' + now_ts}\n","root_folder = gis.content.folders.get()\n","csv_item = root_folder.add(item_properties=item_prop, file=my_csv)\n","csv_item"]},{"cell_type":"markdown","metadata":{},"source":["This spreadsheet has coordinates as `latitude` and `longitude` columns, which will be used for geometries during publishing."]},{"cell_type":"code","execution_count":10,"metadata":{},"outputs":[{"data":{"text/html":["
[item card: USA Capitals spreadsheet 1548875507, Feature Layer Collection by arcgis_python, Last Modified: January 30, 2019, 0 comments, 0 views]
"],"text/plain":["<Item title:\"USA Capitals spreadsheet 1548875507\" type:Feature Layer Collection owner:arcgis_python>"]},"execution_count":10,"metadata":{},"output_type":"execute_result"}],"source":["# publish the csv item into a feature layer\n","cities_item = csv_item.publish()\n","cities_item"]},{"cell_type":"code","execution_count":19,"metadata":{},"outputs":[{"data":{"text/html":["
[item card: USA Capitals, Feature Layer Collection by arcgis_python, Last Modified: January 30, 2019, 0 comments, 0 views]
"],"text/plain":["<Item title:\"USA Capitals\" type:Feature Layer Collection owner:arcgis_python>"]},"execution_count":19,"metadata":{},"output_type":"execute_result"}],"source":["# update the item metadata\n","item_prop = {'title':'USA Capitals'}\n","cities_item.update(item_properties = item_prop, thumbnail='data/updating_gis_content/capital_cities.png')\n","cities_item"]},{"cell_type":"markdown","metadata":{},"source":["### Apply updates from the second spreadsheet\n","The next set of updates has arrived and is stored in `capitals_2.csv`. We are told it contains corrections for the original set in addition to new features. We need to figure out which rows have changed, apply the `update` operation on those, then apply the `add` operation to the new rows.\n","\n","To start with, let us read the second csv file. Note, in this sample, data is stored in csv. In reality, it could be from your enterprise database or any other data source."]},{"cell_type":"code","execution_count":11,"metadata":{},"outputs":[{"data":{"text/html":["
[HTML table: cities_df_2.head(); see the text/plain rendering below]
"],"text/plain":[" city_id name state capital pop2000 pop2007 longitude latitude\n","0 20 Baton Rouge LA State 227818 228810 -91.140227 30.458091\n","1 21 Helena MT State 25780 26007 -112.027027 46.595809\n","2 22 Bismarck ND State 55532 59344 -100.779000 46.813346\n","3 23 Pierre SD State 13876 14169 -100.336382 44.367964\n","4 24 St. Paul MN State 287151 291643 -93.114118 44.954364"]},"execution_count":11,"metadata":{},"output_type":"execute_result"}],"source":["# read the second csv set\n","csv2 = 'data/updating_gis_content/capitals_2.csv'\n","cities_df_2 = pd.read_csv(csv2)\n","cities_df_2.head()"]},{"cell_type":"code","execution_count":12,"metadata":{},"outputs":[{"data":{"text/plain":["(36, 8)"]},"execution_count":12,"metadata":{},"output_type":"execute_result"}],"source":["# get the dimensions of this csv\n","cities_df_2.shape"]},{"cell_type":"markdown","metadata":{},"source":["#### Identifying existing features that need to be updated\n","\n","To identify features that need to be updated, let us read the attribute table of the published feature layer and compare that against the second csv. To read the attribute table, we perform a `query()` on the feature layer which returns us an `arcgis.feature.FeatureSet` object. Refer to the guide pages on [accessing features from feature layers](https://developers.arcgis.com/python/guide/working-with-feature-layers-and-features/) to learn more about this.\n","\n","Note, at this point, we could work with the `cities_df_1` dataframe we created from the original csv file. However, in practice you may not always have the original dataset or your feature layer might have undergone edits after it was published. Hence, we query the feature layer directly."]},{"cell_type":"code","execution_count":13,"metadata":{},"outputs":[{"data":{"text/html":["
[HTML table: cities_fset.sdf.head(); see the text/plain rendering below]
"],"text/plain":["                                               SHAPE capital  city_id  \\\n","0  {\"x\": -17568824.553, \"y\": 2428377.352700006, \"...   State        1   \n","1  {\"x\": -14973760.769500002, \"y\": 8041504.674200...   State        2   \n","2  {\"x\": -12939516.521100001, \"y\": 5405860.248099...   State        3   \n","3  {\"x\": -13680394.263900002, \"y\": 5949000.547900...   State        4   \n","4  {\"x\": -13695542.842799995, \"y\": 5610682.544100...   State        5   \n","\n","    latitude   longitude        name  objectid  pop2000  pop2007 state  \n","0  21.305782 -157.823436    Honolulu         1   371657   378587    HI  \n","1  58.351418 -134.511582      Juneau         2    30711    31592    AK  \n","2  43.613736 -116.237655  Boise City         3   185787   203529    ID  \n","3  47.042418 -122.893073     Olympia         4    27514    45523    WA  \n","4  44.931109 -123.029155       Salem         5   136924   152039    OR  "]},"execution_count":13,"metadata":{},"output_type":"execute_result"}],"source":["cities_flayer = cities_item.layers[0]\n","cities_fset = cities_flayer.query() #querying without any conditions returns all the features\n","cities_fset.sdf.head()"]},{"cell_type":"markdown","metadata":{},"source":["The `city_id` column is common to both datasets. Next, let us perform an `inner` join with the table from the feature layer as left and the updated csv as right. Inner joins will yield those rows that are present in both tables. Learn more about [inner joins here](https://www.w3schools.com/sql/sql_join_inner.asp)."]},{"cell_type":"code","execution_count":14,"metadata":{},"outputs":[{"data":{"text/html":["
[HTML table: overlap_rows; see the text/plain rendering below]
"],"text/plain":["                                               SHAPE capital_x  city_id  \\\n","0  {'x': -13680394.263900002, 'y': 5949000.547900...     State        4   \n","1  {'x': 11666510.350300007, 'y': -5033833.302499...     State       10   \n","2  {'x': -10857545.543799996, 'y': 4229619.674200...     State       13   \n","3  {'x': -10421826.864700003, 'y': 5099899.263700...     State       16   \n","\n","   latitude_x  longitude_x         name_x  objectid  pop2000_x  pop2007_x  \\\n","0   47.042418  -122.893073        Olympia         4      27514      45523   \n","1  -41.145545   104.802046       Cheyenne        10      53011      54750   \n","2   35.482309   -97.534991  Oklahoma City        13     506132     552556   \n","3   41.590936   -93.620864     Des Moines        16     200682     201257   \n","\n","  state_x         name_y state_y capital_y  pop2000_y  pop2007_y  longitude_y  \\\n","0      WA        Olympia      WA     State      42514      45523  -122.893073   \n","1      WY       Cheyenne      WY     State      53011      54750  -104.802046   \n","2     OKK  Oklahoma City      OK     State     506132     552556   -97.534991   \n","3      IA     Des Moines      IA     State     198682     201257   -93.620864   \n","\n","   latitude_y  \n","0   47.042418  \n","1   41.145545  \n","2   35.482309  \n","3   41.590936  "]},"execution_count":14,"metadata":{},"output_type":"execute_result"}],"source":["overlap_rows = pd.merge(left = cities_fset.sdf, right = cities_df_2, how='inner',\n","                        on = 'city_id')\n","overlap_rows"]},{"cell_type":"markdown","metadata":{},"source":["Thus, of the 19 features in the original dataset and the 36 features in the second csv, 4 features are common. Inspecting the table, we find certain columns were updated: for instance, Cheyenne has its coordinates corrected, Oklahoma City has its state abbreviation corrected, and similarly, other cities have one of their attribute columns updated.\n","\n","We could either update individual attribute values for these 4 features or update all attribute values with the latest csv. Below, we are performing the latter as it is simple and fast.\n","\n","#### Perform updates to the existing features"]},{"cell_type":"code","execution_count":15,"metadata":{},"outputs":[],"source":["features_for_update = [] #list containing corrected features\n","all_features = cities_fset.features"]},{"cell_type":"code","execution_count":16,"metadata":{},"outputs":[{"data":{"text/plain":["{\"geometry\": {\"x\": -17568824.553, \"y\": 2428377.352700006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 1, \"city_id\": 1, \"name\": \"Honolulu\", \"state\": \"HI\", \"capital\": \"State\", \"pop2000\": 371657, \"pop2007\": 378587, \"longitude\": -157.8234362, \"latitude\": 21.30578163, \"SHAPE\": {\"x\": -17568824.553, \"y\": 2428377.352700006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}"]},"execution_count":16,"metadata":{},"output_type":"execute_result"}],"source":["# inspect one of the features\n","all_features[0]"]},{"cell_type":"markdown","metadata":{},"source":["Note that the X and Y geometry values are different from the decimal degree coordinates present in the Longitude and Latitude fields. To perform geometry edits, we need to project the coordinates to match that of the feature layer."]},{"cell_type":"code","execution_count":17,"metadata":{},"outputs":[{"data":{"text/plain":["{'wkid': 102100, 'latestWkid': 3857}"]},"execution_count":17,"metadata":{},"output_type":"execute_result"}],"source":["# get the spatial reference of the features since we need to update the geometry\n","cities_fset.spatial_reference"]},{"cell_type":"markdown","metadata":{},"source":["Below, we prepare updated geometries and attributes for each of the 4 features we determined above. We use the `arcgis.geometry` module to `project` the coordinates from the geographic to the projected coordinate system. 
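To make the projection step easier to follow, here is a minimal sketch of the same `project()` call on a single point (Olympia's longitude and latitude from the csv; the printed values are approximate):

```python
from arcgis import geometry

# one point in WGS84 decimal degrees: x is longitude, y is latitude
olympia_wgs84 = {'x': -122.8930726, 'y': 47.04241817}

# project it into the layer's Web Mercator spatial reference (latestWkid 3857)
projected = geometry.project(geometries=[olympia_wgs84],
                             in_sr=4326,
                             out_sr=cities_fset.spatial_reference['latestWkid'],
                             gis=gis)
print(projected[0])  # roughly {'x': -13680394.26, 'y': 5949000.55}
```

The loop below makes this same call once per feature while it rebuilds the attributes. 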
The cell below prints the original `Feature` objects followed by the updated ones. If you look closely, you can find the differences."]},{"cell_type":"code","execution_count":18,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["{\"geometry\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 4, \"city_id\": 4, \"name\": \"Olympia\", \"state\": \"WA\", \"capital\": \"State\", \"pop2000\": 27514, \"pop2007\": 45523, \"longitude\": -122.8930726, \"latitude\": 47.04241817, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","{\"geometry\": {\"x\": -13680394.2638528, \"y\": 5949000.54792491}, \"attributes\": {\"objectid\": 4, \"city_id\": 4, \"name\": \"Olympia\", \"state\": \"WA\", \"capital\": \"State\", \"pop2000\": 42514, \"pop2007\": 45523, \"longitude\": -122.8930726, \"latitude\": 47.04241817, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","========================================================================\n","{\"geometry\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 10, \"city_id\": 10, \"name\": \"Cheyenne\", \"state\": \"WY\", \"capital\": \"State\", \"pop2000\": 53011, \"pop2007\": 54750, \"longitude\": 104.8020456, \"latitude\": -41.14554516, \"SHAPE\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","{\"geometry\": {\"x\": -11666510.350285435, \"y\": 5033833.302497153}, \"attributes\": {\"objectid\": 10, \"city_id\": 10, \"name\": \"Cheyenne\", \"state\": \"WY\", \"capital\": \"State\", \"pop2000\": 53011, \"pop2007\": 54750, \"longitude\": -104.80204559999999, \"latitude\": 41.14554516, \"SHAPE\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","========================================================================\n","{\"geometry\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 13, \"city_id\": 13, \"name\": \"Oklahoma City\", \"state\": \"OKK\", \"capital\": \"State\", \"pop2000\": 506132, \"pop2007\": 552556, \"longitude\": -97.5349911, \"latitude\": 35.48230867, \"SHAPE\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","{\"geometry\": {\"x\": -10857545.54377847, \"y\": 4229619.674165817}, \"attributes\": {\"objectid\": 13, \"city_id\": 13, \"name\": \"Oklahoma City\", \"state\": \"OK\", \"capital\": \"State\", \"pop2000\": 506132, \"pop2007\": 552556, \"longitude\": -97.5349911, \"latitude\": 35.48230867, \"SHAPE\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","========================================================================\n","{\"geometry\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 16, \"city_id\": 16, \"name\": \"Des Moines\", \"state\": \"IA\", \"capital\": \"State\", \"pop2000\": 200682, \"pop2007\": 201257, \"longitude\": -93.62086361, 
\"latitude\": 41.59093617, \"SHAPE\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","{\"geometry\": {\"x\": -10421826.864691716, \"y\": 5099899.263692743}, \"attributes\": {\"objectid\": 16, \"city_id\": 16, \"name\": \"Des Moines\", \"state\": \"IA\", \"capital\": \"State\", \"pop2000\": 198682, \"pop2007\": 201257, \"longitude\": -93.62086361, \"latitude\": 41.59093617, \"SHAPE\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","========================================================================\n"]}],"source":["from arcgis import geometry #use geometry module to project Long,Lat to X and Y\n","from copy import deepcopy\n","\n","for city_id in overlap_rows['city_id']:\n"," # get the feature to be updated\n"," original_feature = [f for f in all_features if f.attributes['city_id'] == city_id][0]\n"," feature_to_be_updated = deepcopy(original_feature)\n"," \n"," print(str(original_feature))\n"," \n"," # get the matching row from csv\n"," matching_row = cities_df_2.where(cities_df_2.city_id == city_id).dropna()\n"," \n"," #get geometries in the destination coordinate system\n"," input_geometry = {'y':float(matching_row['latitude']),\n"," 'x':float(matching_row['longitude'])}\n"," output_geometry = geometry.project(geometries = [input_geometry],\n"," in_sr = 4326, \n"," out_sr = cities_fset.spatial_reference['latestWkid'],\n"," gis = gis)\n"," \n"," # assign the updated values\n"," feature_to_be_updated.geometry = output_geometry[0]\n"," feature_to_be_updated.attributes['longitude'] = float(matching_row['longitude'])\n"," feature_to_be_updated.attributes['city_id'] = int(matching_row['city_id'])\n"," feature_to_be_updated.attributes['state'] = matching_row['state'].values[0]\n"," feature_to_be_updated.attributes['capital'] = matching_row['capital'].values[0]\n"," feature_to_be_updated.attributes['latitude'] = float(matching_row['latitude'])\n"," feature_to_be_updated.attributes['name'] = matching_row['name'].values[0]\n"," feature_to_be_updated.attributes['pop2000'] = int(matching_row['pop2000'])\n"," feature_to_be_updated.attributes['pop2007'] = int(matching_row['pop2007'])\n"," \n"," #add this to the list of features to be updated\n"," features_for_update.append(feature_to_be_updated)\n"," \n"," print(str(feature_to_be_updated))\n"," print(\"========================================================================\")"]},{"cell_type":"markdown","metadata":{},"source":["We have constructed a list of features with updated values. 
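One practical note before applying them: `edit_features()` returns a dictionary of per-feature results, so it is worth inspecting the `success` flags rather than assuming every edit went through. A small sketch of such a check (the helper name is ours, not part of the API):

```python
# a sketch: scan an edit_features() response and report any failed edits
def report_failed_edits(response):
    for op in ('addResults', 'updateResults', 'deleteResults'):
        failed = [r['objectId'] for r in response.get(op, []) if not r['success']]
        if failed:
            print(f"{op}: failed object ids {failed}")

# usage: report_failed_edits(cities_flayer.edit_features(updates=features_for_update))
```
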
We can use this list to perform updates on the feature layer."]},{"cell_type":"code","execution_count":19,"metadata":{},"outputs":[{"data":{"text/plain":["[{\"geometry\": {\"x\": -13680394.2638528, \"y\": 5949000.54792491}, \"attributes\": {\"objectid\": 4, \"city_id\": 4, \"name\": \"Olympia\", \"state\": \"WA\", \"capital\": \"State\", \"pop2000\": 42514, \"pop2007\": 45523, \"longitude\": -122.8930726, \"latitude\": 47.04241817, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}},\n"," {\"geometry\": {\"x\": -11666510.350285435, \"y\": 5033833.302497153}, \"attributes\": {\"objectid\": 10, \"city_id\": 10, \"name\": \"Cheyenne\", \"state\": \"WY\", \"capital\": \"State\", \"pop2000\": 53011, \"pop2007\": 54750, \"longitude\": -104.80204559999999, \"latitude\": 41.14554516, \"SHAPE\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}},\n"," {\"geometry\": {\"x\": -10857545.54377847, \"y\": 4229619.674165817}, \"attributes\": {\"objectid\": 13, \"city_id\": 13, \"name\": \"Oklahoma City\", \"state\": \"OK\", \"capital\": \"State\", \"pop2000\": 506132, \"pop2007\": 552556, \"longitude\": -97.5349911, \"latitude\": 35.48230867, \"SHAPE\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}},\n"," {\"geometry\": {\"x\": -10421826.864691716, \"y\": 5099899.263692743}, \"attributes\": {\"objectid\": 16, \"city_id\": 16, \"name\": \"Des Moines\", \"state\": \"IA\", \"capital\": \"State\", \"pop2000\": 198682, \"pop2007\": 201257, \"longitude\": -93.62086361, \"latitude\": 41.59093617, \"SHAPE\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}]"]},"execution_count":19,"metadata":{},"output_type":"execute_result"}],"source":["features_for_update"]},{"cell_type":"markdown","metadata":{},"source":["To update the feature layer, call the `edit_features()` method of the `FeatureLayer` object and pass the list of features to the `updates` parameter:"]},{"cell_type":"code","execution_count":20,"metadata":{},"outputs":[{"data":{"text/plain":["{'addResults': [],\n"," 'updateResults': [{'objectId': 4, 'globalId': None, 'success': True},\n"," {'objectId': 10, 'globalId': None, 'success': True},\n"," {'objectId': 13, 'globalId': None, 'success': True},\n"," {'objectId': 16, 'globalId': None, 'success': True}],\n"," 'deleteResults': [],\n"," 'attachments': {'addResults': [], 'updateResults': [], 'deleteResults': []}}"]},"execution_count":20,"metadata":{},"output_type":"execute_result"}],"source":["cities_flayer.edit_features(updates= features_for_update)"]},{"cell_type":"markdown","metadata":{},"source":["We have successfully applied corrections to those features which existed in the feature layer from the initial dataset. Next let us proceed to adding new features present only in the second csv file.\n","\n","#### Identifying new features that need to be added"]},{"cell_type":"code","execution_count":21,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["(32, 8)\n"]}],"source":["#select those rows in the capitals_2.csv that do not overlap with those in capitals_1.csv\n","new_rows = cities_df_2[~cities_df_2['city_id'].isin(overlap_rows['city_id'])]\n","print(new_rows.shape)"]},{"cell_type":"code","execution_count":22,"metadata":{},"outputs":[{"data":{"text/html":["
[HTML table: new_rows.head(); see the text/plain rendering below]
"],"text/plain":[" city_id name state capital pop2000 pop2007 longitude latitude\n","0 20 Baton Rouge LA State 227818 228810 -91.140227 30.458091\n","1 21 Helena MT State 25780 26007 -112.027027 46.595809\n","2 22 Bismarck ND State 55532 59344 -100.779000 46.813346\n","3 23 Pierre SD State 13876 14169 -100.336382 44.367964\n","4 24 St. Paul MN State 287151 291643 -93.114118 44.954364"]},"execution_count":22,"metadata":{},"output_type":"execute_result"}],"source":["new_rows.head()"]},{"cell_type":"markdown","metadata":{},"source":["Thus, of the total 36 rows in the second csv, we have determined the 32 other rows which are new and need to be appended as new features."]},{"cell_type":"markdown","metadata":{},"source":["#### Adding new features\n","Next, let us compose another `list` of `Feature` objects similar to earlier, from the `new_rows` data frame."]},{"cell_type":"code","execution_count":23,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["Creating Baton Rouge\n","Creating Helena\n","Creating Bismarck\n","Creating Pierre\n","Creating St. Paul\n","Creating Madison\n","Creating Lansing\n","Creating Augusta\n","Creating Montpelier\n","Creating Albany\n","Creating Boston\n","Creating Concord\n","Creating Tallahassee\n","Creating Jackson\n","Creating Nashville\n","Creating Montgomery\n","Creating Springfield\n","Creating Indianapolis\n","Creating Frankfort\n","Creating Columbus\n","Creating Charleston\n","Creating Atlanta\n","Creating Columbia\n","Creating Raleigh\n","Creating Richmond\n","Creating Harrisburg\n","Creating Trenton\n","Creating Dover\n","Creating Washington\n","Creating Annapolis\n","Creating Hartford\n","Creating Providence\n"]}],"source":["features_to_be_added = []\n","\n","# get a template feature object\n","template_feature = deepcopy(features_for_update[0])\n","\n","# loop through each row and add to the list of features to be added\n","for row in new_rows.iterrows():\n"," new_feature = deepcopy(template_feature)\n"," \n"," #print\n"," print(\"Creating \" + row[1]['name'])\n"," \n"," #get geometries in the destination coordinate system\n"," input_geometry = {'y':float(row[1]['latitude']),\n"," 'x':float(row[1]['longitude'])}\n"," output_geometry = geometry.project(geometries = [input_geometry],\n"," in_sr = 4326, \n"," out_sr = cities_fset.spatial_reference['latestWkid'],\n"," gis = gis)\n"," \n"," # assign the updated values\n"," new_feature.geometry = output_geometry[0]\n"," new_feature.attributes['longitude'] = float(row[1]['longitude'])\n"," new_feature.attributes['city_id'] = int(row[1]['city_id'])\n"," new_feature.attributes['state'] = row[1]['state']\n"," new_feature.attributes['capital'] = row[1]['capital']\n"," new_feature.attributes['latitude'] = float(row[1]['latitude'])\n"," new_feature.attributes['name'] = row[1]['name']\n"," new_feature.attributes['pop2000'] = int(row[1]['pop2000'])\n"," new_feature.attributes['pop2007'] = int(row[1]['pop2007'])\n"," \n"," #add this to the list of features to be updated\n"," features_to_be_added.append(new_feature)"]},{"cell_type":"code","execution_count":24,"metadata":{},"outputs":[{"data":{"text/plain":["{\"geometry\": {\"x\": -10145683.671555312, \"y\": 3562570.141302621}, \"attributes\": {\"objectid\": 4, \"city_id\": 20, \"name\": \"Baton Rouge\", \"state\": \"LA\", \"capital\": \"State\", \"pop2000\": 227818, \"pop2007\": 228810, \"longitude\": -91.14022709999999, \"latitude\": 30.45809113, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 
102100, \"latestWkid\": 3857}}}}"]},"execution_count":24,"metadata":{},"output_type":"execute_result"}],"source":["# take a look at one of the features we created\n","features_to_be_added[0]"]},{"cell_type":"markdown","metadata":{},"source":["Thus, we have created a `list` of `Feature` objects with appropriate attributes and geometries. Next, to add these new features to the feature layer, call the `edit_features()` method of the `FeatureLayer` object and pass the list of `Feature` objects to the `adds` parameter:"]},{"cell_type":"code","execution_count":25,"metadata":{},"outputs":[{"data":{"text/plain":["{'addResults': [{'objectId': 20, 'globalId': None, 'success': True},\n"," {'objectId': 21, 'globalId': None, 'success': True},\n"," {'objectId': 22, 'globalId': None, 'success': True},\n"," {'objectId': 23, 'globalId': None, 'success': True},\n"," {'objectId': 24, 'globalId': None, 'success': True},\n"," {'objectId': 25, 'globalId': None, 'success': True},\n"," {'objectId': 26, 'globalId': None, 'success': True},\n"," {'objectId': 27, 'globalId': None, 'success': True},\n"," {'objectId': 28, 'globalId': None, 'success': True},\n"," {'objectId': 29, 'globalId': None, 'success': True},\n"," {'objectId': 30, 'globalId': None, 'success': True},\n"," {'objectId': 31, 'globalId': None, 'success': True},\n"," {'objectId': 32, 'globalId': None, 'success': True},\n"," {'objectId': 33, 'globalId': None, 'success': True},\n"," {'objectId': 34, 'globalId': None, 'success': True},\n"," {'objectId': 35, 'globalId': None, 'success': True},\n"," {'objectId': 36, 'globalId': None, 'success': True},\n"," {'objectId': 37, 'globalId': None, 'success': True},\n"," {'objectId': 38, 'globalId': None, 'success': True},\n"," {'objectId': 39, 'globalId': None, 'success': True},\n"," {'objectId': 40, 'globalId': None, 'success': True},\n"," {'objectId': 41, 'globalId': None, 'success': True},\n"," {'objectId': 42, 'globalId': None, 'success': True},\n"," {'objectId': 43, 'globalId': None, 'success': True},\n"," {'objectId': 44, 'globalId': None, 'success': True},\n"," {'objectId': 45, 'globalId': None, 'success': True},\n"," {'objectId': 46, 'globalId': None, 'success': True},\n"," {'objectId': 47, 'globalId': None, 'success': True},\n"," {'objectId': 48, 'globalId': None, 'success': True},\n"," {'objectId': 49, 'globalId': None, 'success': True},\n"," {'objectId': 50, 'globalId': None, 'success': True},\n"," {'objectId': 51, 'globalId': None, 'success': True}],\n"," 'updateResults': [],\n"," 'deleteResults': [],\n"," 'attachments': {'addResults': [], 'updateResults': [], 'deleteResults': []}}"]},"execution_count":25,"metadata":{},"output_type":"execute_result"}],"source":["cities_flayer.edit_features(adds = features_to_be_added)"]},{"cell_type":"markdown","metadata":{},"source":["Thus, we have successfully applied edits from the second csv file. Next, let us look at how we can apply edits from the third csv file.\n","\n","### Apply edits from third spreadsheet\n","The next set of updates has arrived and is stored in `capitals_annex.csv`. We are told it contains additional columns for each of the features that we want to add to the feature layer.\n","\n","To start with, let us read the third csv file. Note, in this sample, data is stored in csv. In reality, it could be from your enterprise database or any other data source."]},{"cell_type":"code","execution_count":26,"metadata":{},"outputs":[{"data":{"text/html":["
[HTML table: cities_df_3.head(); see the text/plain rendering below]
"],"text/plain":["   city_id        name                    class   white  black  ameri_es  \\\n","0        1    Honolulu  Census Designated Place   73093   6038       689   \n","1        2      Juneau         City and Borough   22969    248      3496   \n","2        3  Boise City                     City  171204   1437      1300   \n","3        4     Olympia                     City   36246    805       553   \n","4        5       Salem                     City  113746   1750      2064   \n","\n","    asian  hawn_pl  hispanic   males  females  \n","0  207588    25457     16229  182628   189029  \n","1    1438      116      1040   15469    15242  \n","2    3870      302      8410   92014    93773  \n","3    2473      125      1863   20319    22195  \n","4    3304      643     19973   68752    68172  "]},"execution_count":26,"metadata":{},"output_type":"execute_result"}],"source":["# read the third csv set\n","csv3 = 'data/updating_gis_content/capitals_annex.csv'\n","cities_df_3 = pd.read_csv(csv3)\n","cities_df_3.head()"]},{"cell_type":"code","execution_count":27,"metadata":{},"outputs":[{"data":{"text/plain":["(51, 11)"]},"execution_count":27,"metadata":{},"output_type":"execute_result"}],"source":["#find the number of rows in the third csv\n","cities_df_3.shape"]},{"cell_type":"markdown","metadata":{},"source":["The `capitals_annex.csv` does not add new features; instead, it adds additional attribute columns to existing features. It has 51 rows, which were found to match the 19 + 32 rows from the first and second csv files. The columns `City_ID` and `NAME` are common to all 3 spreadsheets. Next, let us take a look at how we can append this additional attribute information to our feature layer.\n","\n","#### Inspecting existing fields of the feature layer\n","The `manager` property of the `FeatureLayer` object exposes a set of methods to read and update the properties and definition of feature layers."]},{"cell_type":"code","execution_count":28,"metadata":{},"outputs":[{"data":{"text/plain":["{\n","  \"name\": \"city_id\",\n","  \"type\": \"esriFieldTypeInteger\",\n","  \"alias\": \"city_id\",\n","  \"domain\": null,\n","  \"editable\": true,\n","  \"nullable\": true,\n","  \"sqlType\": \"sqlTypeInteger\",\n","  \"length\": 10\n","}"]},"execution_count":28,"metadata":{},"output_type":"execute_result"}],"source":["#Get the existing list of fields on the cities feature layer\n","cities_fields = cities_flayer.manager.properties.fields\n","\n","# Your feature layer may have multiple fields, \n","# instead of printing all, let us take a look at one of the fields:\n","cities_fields[1]"]},{"cell_type":"markdown","metadata":{},"source":["From above, we can see the representation of one of the fields. Let us loop through each of the fields and print the `name`, `alias`, `type` and `sqlType` properties."]},{"cell_type":"code","execution_count":29,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["objectid     | objectid     | esriFieldTypeOID         | sqlTypeInteger\n","city_id      | city_id      | esriFieldTypeInteger     | sqlTypeInteger\n","name         | name         | esriFieldTypeString      | sqlTypeVarchar\n","state        | state        | esriFieldTypeString      | sqlTypeVarchar\n","capital      | capital      | esriFieldTypeString      | sqlTypeVarchar\n","pop2000      | pop2000      | esriFieldTypeInteger     | sqlTypeInteger\n","pop2007      | pop2007      | esriFieldTypeInteger     | sqlTypeInteger\n","longitude    | longitude    | esriFieldTypeDouble      | sqlTypeNumeric\n","latitude     | latitude     | esriFieldTypeDouble      | sqlTypeNumeric\n"]}],"source":["for field in cities_fields:\n","    print(f\"{field.name:13}| {field.alias:13}| {field.type:25}| {field.sqlType}\")"]},{"cell_type":"markdown","metadata":{},"source":["#### Preparing additional columns to add to the feature layer\n","Now that we have an idea of how the fields are defined, we can go ahead and append new fields to the layer's definition. 
Once we compose a list of new fields, we can push those changes to the feature layer by calling the `add_to_definition()` method. Once the feature layer's definition is updated with new fields, we can loop through each feature and add the appropriate attribute values.\n","\n","To compose a list of new fields to be added, we start by making a copy of one of the fields as a template and start editing it. One convenient aspect of this example is that all the new fields except one are of the same data type: integer. With your data, this may not be the case. In such instances, you can add each field individually."]},{"cell_type":"code","execution_count":30,"metadata":{},"outputs":[{"data":{"text/plain":["{'name': 'city_id',\n"," 'type': 'esriFieldTypeInteger',\n"," 'alias': 'city_id',\n"," 'domain': None,\n"," 'editable': True,\n"," 'nullable': True,\n"," 'sqlType': 'sqlTypeInteger',\n"," 'length': 10}"]},"execution_count":30,"metadata":{},"output_type":"execute_result"}],"source":["# get a template field\n","template_field = dict(deepcopy(cities_fields[1]))\n","template_field"]},{"cell_type":"markdown","metadata":{},"source":["Let us use pandas to get the list of fields that are **new** in spreadsheet 3"]},{"cell_type":"code","execution_count":31,"metadata":{},"outputs":[{"data":{"text/plain":["['ameri_es',\n"," 'asian',\n"," 'black',\n"," 'class',\n"," 'females',\n"," 'hawn_pl',\n"," 'hispanic',\n"," 'males',\n"," 'white']"]},"execution_count":31,"metadata":{},"output_type":"execute_result"}],"source":["# get the list of new fields to add from the third spreadsheet, that are not in spreadsheets 1 and 2\n","new_field_names = list(cities_df_3.columns.difference(cities_df_1.columns))\n","new_field_names"]},{"cell_type":"markdown","metadata":{},"source":["Now loop through each new field name and create a field dictionary using the template we created earlier; for reference, pushing a single field by hand would look like the sketch below. 
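A minimal sketch of that manual route (the `pop2010` field is hypothetical, purely for illustration):

```python
# a sketch: define one new integer field and push just it to the layer's definition
single_field = {
    'name': 'pop2010',            # hypothetical column name
    'type': 'esriFieldTypeInteger',
    'alias': 'pop2010',
    'domain': None,
    'editable': True,
    'nullable': True,
    'sqlType': 'sqlTypeInteger',
    'length': 10
}
cities_flayer.manager.add_to_definition({'fields': [single_field]})  # returns {'success': True} when it works
```
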
Except for the field titled `class`, all other fields are of type `integer`."]},{"cell_type":"code","execution_count":32,"metadata":{},"outputs":[{"data":{"text/plain":["9"]},"execution_count":32,"metadata":{},"output_type":"execute_result"}],"source":["fields_to_be_added = []\n","for new_field_name in new_field_names:\n","    current_field = deepcopy(template_field)\n","    if new_field_name.lower() == 'class':\n","        current_field['sqlType'] = 'sqlTypeVarchar'\n","        current_field['type'] = 'esriFieldTypeString'\n","        current_field['length'] = 8000\n","    current_field['name'] = new_field_name.lower()\n","    current_field['alias'] = new_field_name\n","    fields_to_be_added.append(current_field)\n","    \n","len(fields_to_be_added)"]},{"cell_type":"code","execution_count":33,"metadata":{},"outputs":[{"data":{"text/plain":["{'name': 'class',\n"," 'type': 'esriFieldTypeString',\n"," 'alias': 'class',\n"," 'domain': None,\n"," 'editable': True,\n"," 'nullable': True,\n"," 'sqlType': 'sqlTypeVarchar',\n"," 'length': 8000}"]},"execution_count":33,"metadata":{},"output_type":"execute_result"}],"source":["#inspect one of the fields\n","fields_to_be_added[3]"]},{"cell_type":"markdown","metadata":{},"source":["#### Adding additional fields to the feature layer\n","The list of new fields we composed can be pushed to the server by calling the `add_to_definition()` method on the `manager` property."]},{"cell_type":"code","execution_count":34,"metadata":{},"outputs":[{"data":{"text/plain":["{'success': True}"]},"execution_count":34,"metadata":{},"output_type":"execute_result"}],"source":["cities_flayer.manager.add_to_definition({'fields':fields_to_be_added})"]},{"cell_type":"markdown","metadata":{},"source":["Thus, we have successfully added new fields to our feature layer. Let us verify the new columns show up:"]},{"cell_type":"code","execution_count":35,"metadata":{},"outputs":[{"data":{"text/plain":["18"]},"execution_count":35,"metadata":{},"output_type":"execute_result"}],"source":["new_cities_fields = cities_flayer.manager.properties.fields\n","len(new_cities_fields)"]},{"cell_type":"code","execution_count":36,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["objectid  | esriFieldTypeOID\n","city_id   | esriFieldTypeInteger\n","name      | esriFieldTypeString\n","state     | esriFieldTypeString\n","capital   | esriFieldTypeString\n","pop2000   | esriFieldTypeInteger\n","pop2007   | esriFieldTypeInteger\n","longitude | esriFieldTypeDouble\n","latitude  | esriFieldTypeDouble\n","ameri_es  | esriFieldTypeInteger\n","asian     | esriFieldTypeInteger\n","black     | esriFieldTypeInteger\n","class     | esriFieldTypeString\n","females   | esriFieldTypeInteger\n","hawn_pl   | esriFieldTypeInteger\n","hispanic  | esriFieldTypeInteger\n","males     | esriFieldTypeInteger\n","white     | esriFieldTypeInteger\n"]}],"source":["for field in new_cities_fields:\n","    print(f\"{field.name:10}| {field.type}\")"]},{"cell_type":"markdown","metadata":{},"source":["#### Adding attribute values to the new columns\n","Next, we can loop through each row in the third csv and add the new attribute values for these newly created columns."]},{"cell_type":"code","execution_count":37,"metadata":{},"outputs":[],"source":["# Run a fresh query on the feature layer so it includes the new features from\n","# csv2 and new columns from csv3\n","cities_fset2 = cities_flayer.query()\n","cities_features2 = cities_fset2.features"]},{"cell_type":"markdown","metadata":{},"source":["Loop through each row in the third spreadsheet, find the corresponding feature by matching the `city_id` 
value and apply the attribute values for the new fields."]},{"cell_type":"code","execution_count":38,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["1 Adding additional attributes for: Honolulu\n","2 Adding additional attributes for: Juneau\n","3 Adding additional attributes for: Boise City\n","4 Adding additional attributes for: Olympia\n","5 Adding additional attributes for: Salem\n","6 Adding additional attributes for: Carson\n","7 Adding additional attributes for: Sacramento\n","8 Adding additional attributes for: Phoenix\n","9 Adding additional attributes for: Salt Lake City\n","10 Adding additional attributes for: Cheyenne\n","11 Adding additional attributes for: Denver\n","12 Adding additional attributes for: Santa Fe\n","13 Adding additional attributes for: Oklahoma City\n","14 Adding additional attributes for: Topeka\n","15 Adding additional attributes for: Lincoln\n","16 Adding additional attributes for: Des Moines\n","17 Adding additional attributes for: Jefferson City\n","18 Adding additional attributes for: Little Rock\n","19 Adding additional attributes for: Austin\n","20 Adding additional attributes for: Baton Rouge\n","21 Adding additional attributes for: Helena\n","22 Adding additional attributes for: Bismarck\n","23 Adding additional attributes for: Pierre\n","24 Adding additional attributes for: St. Paul\n","25 Adding additional attributes for: Madison\n","26 Adding additional attributes for: Lansing\n","27 Adding additional attributes for: Augusta\n","28 Adding additional attributes for: Montpelier\n","29 Adding additional attributes for: Albany\n","30 Adding additional attributes for: Boston\n","31 Adding additional attributes for: Concord\n","32 Adding additional attributes for: Tallahassee\n","33 Adding additional attributes for: Jackson\n","34 Adding additional attributes for: Nashville\n","35 Adding additional attributes for: Montgomery\n","36 Adding additional attributes for: Springfield\n","37 Adding additional attributes for: Indianapolis\n","38 Adding additional attributes for: Frankfort\n","39 Adding additional attributes for: Columbus\n","40 Adding additional attributes for: Charleston\n","41 Adding additional attributes for: Atlanta\n","42 Adding additional attributes for: Columbia\n","43 Adding additional attributes for: Raleigh\n","44 Adding additional attributes for: Richmond\n","45 Adding additional attributes for: Harrisburg\n","46 Adding additional attributes for: Trenton\n","47 Adding additional attributes for: Dover\n","48 Adding additional attributes for: Washington\n","49 Adding additional attributes for: Annapolis\n","50 Adding additional attributes for: Hartford\n","51 Adding additional attributes for: Providence\n"]}],"source":["features_for_update = []\n","for city_id in cities_df_3['city_id']:\n"," # get the matching row from csv\n"," matching_row = cities_df_3.where(cities_df_3.city_id == city_id).dropna()\n"," \n"," print(str(city_id) + \" Adding additional attributes for: \" + matching_row['name'].values[0])\n"," \n"," # get the feature to be updated\n"," original_feature = [f for f in cities_features2 if f.attributes['city_id'] == city_id][0]\n"," feature_to_be_updated = deepcopy(original_feature)\n"," \n"," # assign the updated values\n"," feature_to_be_updated.attributes['class'] = matching_row['class'].values[0]\n"," feature_to_be_updated.attributes['white'] = int(matching_row['white'])\n"," feature_to_be_updated.attributes['black'] = int(matching_row['black'])\n"," feature_to_be_updated.attributes['ameri_es'] 
= int(matching_row['ameri_es'])\n"," feature_to_be_updated.attributes['asian'] = int(matching_row['asian'])\n"," feature_to_be_updated.attributes['hawn_pl'] = int(matching_row['hawn_pl'])\n"," feature_to_be_updated.attributes['hispanic'] = int(matching_row['hispanic'])\n"," feature_to_be_updated.attributes['males'] = int(matching_row['males'])\n"," feature_to_be_updated.attributes['females'] = int(matching_row['females'])\n"," \n"," #add this to the list of features to be updated\n"," features_for_update.append(feature_to_be_updated)"]},{"cell_type":"code","execution_count":39,"metadata":{},"outputs":[{"data":{"text/plain":["{\"geometry\": {\"x\": -7950674.8190312665, \"y\": 5134585.3226086125}, \"attributes\": {\"objectid\": 51, \"city_id\": 51, \"name\": \"Providence\", \"state\": \"RI\", \"capital\": \"State\", \"pop2000\": 173618, \"pop2007\": 183731, \"longitude\": -71.42212709, \"latitude\": 41.82355466, \"ameri_es\": 1975, \"asian\": 10432, \"black\": 25243, \"class\": \"City\", \"females\": 90583, \"hawn_pl\": 270, \"hispanic\": 52146, \"males\": 83035, \"white\": 94666}}"]},"execution_count":39,"metadata":{},"output_type":"execute_result"}],"source":["# inspect one of the features\n","features_for_update[-1]"]},{"cell_type":"code","execution_count":40,"metadata":{},"outputs":[{"data":{"text/plain":["{'addResults': [],\n"," 'updateResults': [{'objectId': 1, 'globalId': None, 'success': True},\n"," {'objectId': 2, 'globalId': None, 'success': True},\n"," {'objectId': 3, 'globalId': None, 'success': True},\n"," {'objectId': 4, 'globalId': None, 'success': True},\n"," {'objectId': 5, 'globalId': None, 'success': True},\n"," {'objectId': 6, 'globalId': None, 'success': True},\n"," {'objectId': 7, 'globalId': None, 'success': True},\n"," {'objectId': 8, 'globalId': None, 'success': True},\n"," {'objectId': 9, 'globalId': None, 'success': True},\n"," {'objectId': 10, 'globalId': None, 'success': True},\n"," {'objectId': 11, 'globalId': None, 'success': True},\n"," {'objectId': 12, 'globalId': None, 'success': True},\n"," {'objectId': 13, 'globalId': None, 'success': True},\n"," {'objectId': 14, 'globalId': None, 'success': True},\n"," {'objectId': 15, 'globalId': None, 'success': True},\n"," {'objectId': 16, 'globalId': None, 'success': True},\n"," {'objectId': 17, 'globalId': None, 'success': True},\n"," {'objectId': 18, 'globalId': None, 'success': True},\n"," {'objectId': 19, 'globalId': None, 'success': True},\n"," {'objectId': 20, 'globalId': None, 'success': True},\n"," {'objectId': 21, 'globalId': None, 'success': True},\n"," {'objectId': 22, 'globalId': None, 'success': True},\n"," {'objectId': 23, 'globalId': None, 'success': True},\n"," {'objectId': 24, 'globalId': None, 'success': True},\n"," {'objectId': 25, 'globalId': None, 'success': True},\n"," {'objectId': 26, 'globalId': None, 'success': True},\n"," {'objectId': 27, 'globalId': None, 'success': True},\n"," {'objectId': 28, 'globalId': None, 'success': True},\n"," {'objectId': 29, 'globalId': None, 'success': True},\n"," {'objectId': 30, 'globalId': None, 'success': True},\n"," {'objectId': 31, 'globalId': None, 'success': True},\n"," {'objectId': 32, 'globalId': None, 'success': True},\n"," {'objectId': 33, 'globalId': None, 'success': True},\n"," {'objectId': 34, 'globalId': None, 'success': True},\n"," {'objectId': 35, 'globalId': None, 'success': True},\n"," {'objectId': 36, 'globalId': None, 'success': True},\n"," {'objectId': 37, 'globalId': None, 'success': True},\n"," {'objectId': 38, 'globalId': None, 'success': 
True},\n"," {'objectId': 39, 'globalId': None, 'success': True},\n"," {'objectId': 40, 'globalId': None, 'success': True},\n"," {'objectId': 41, 'globalId': None, 'success': True},\n"," {'objectId': 42, 'globalId': None, 'success': True},\n"," {'objectId': 43, 'globalId': None, 'success': True},\n"," {'objectId': 44, 'globalId': None, 'success': True},\n"," {'objectId': 45, 'globalId': None, 'success': True},\n"," {'objectId': 46, 'globalId': None, 'success': True},\n"," {'objectId': 47, 'globalId': None, 'success': True},\n"," {'objectId': 48, 'globalId': None, 'success': True},\n"," {'objectId': 49, 'globalId': None, 'success': True},\n"," {'objectId': 50, 'globalId': None, 'success': True},\n"," {'objectId': 51, 'globalId': None, 'success': True}],\n"," 'deleteResults': [],\n"," 'attachments': {'addResults': [], 'updateResults': [], 'deleteResults': []}}"]},"execution_count":40,"metadata":{},"output_type":"execute_result"}],"source":["# apply the edits to the feature layer\n","cities_flayer.edit_features(updates= features_for_update)"]},{"cell_type":"markdown","metadata":{},"source":["#### Verify the changes made so far\n","Let us run another query on the feature layer and visualize a few rows."]},{"cell_type":"code","execution_count":41,"metadata":{},"outputs":[{"data":{"text/html":["
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
SHAPEameri_esasianblackcapitalcity_idclassfemaleshawn_plhispaniclatitudelongitudemalesnameobjectidpop2000pop2007statewhite
0{\"x\": -17568824.553, \"y\": 2428377.352700006, \"...6892075886038State1Census Designated Place189029254571622921.305782-157.823436182628Honolulu1371657378587HI73093
1{\"x\": -14973760.769500002, \"y\": 8041504.674200...34961438248State2City and Borough15242116104058.351418-134.51158215469Juneau23071131592AK22969
2{\"x\": -12939516.521100001, \"y\": 5405860.248099...130038701437State3City93773302841043.613736-116.23765592014Boise City3185787203529ID171204
3{\"x\": -13680394.263852797, \"y\": 5949000.547924...5532473805State4City22195125186347.042418-122.89307320319Olympia44251445523WA36246
4{\"x\": -13695542.842799995, \"y\": 5610682.544100...206433041750State5City681726431997344.931109-123.02915568752Salem5136924152039OR113746
\n","
"],"text/plain":[" SHAPE ameri_es asian black \\\n","0 {\"x\": -17568824.553, \"y\": 2428377.352700006, \"... 689 207588 6038 \n","1 {\"x\": -14973760.769500002, \"y\": 8041504.674200... 3496 1438 248 \n","2 {\"x\": -12939516.521100001, \"y\": 5405860.248099... 1300 3870 1437 \n","3 {\"x\": -13680394.263852797, \"y\": 5949000.547924... 553 2473 805 \n","4 {\"x\": -13695542.842799995, \"y\": 5610682.544100... 2064 3304 1750 \n","\n"," capital city_id class females hawn_pl hispanic \\\n","0 State 1 Census Designated Place 189029 25457 16229 \n","1 State 2 City and Borough 15242 116 1040 \n","2 State 3 City 93773 302 8410 \n","3 State 4 City 22195 125 1863 \n","4 State 5 City 68172 643 19973 \n","\n"," latitude longitude males name objectid pop2000 pop2007 \\\n","0 21.305782 -157.823436 182628 Honolulu 1 371657 378587 \n","1 58.351418 -134.511582 15469 Juneau 2 30711 31592 \n","2 43.613736 -116.237655 92014 Boise City 3 185787 203529 \n","3 47.042418 -122.893073 20319 Olympia 4 42514 45523 \n","4 44.931109 -123.029155 68752 Salem 5 136924 152039 \n","\n"," state white \n","0 HI 73093 \n","1 AK 22969 \n","2 ID 171204 \n","3 WA 36246 \n","4 OR 113746 "]},"execution_count":41,"metadata":{},"output_type":"execute_result"}],"source":["cities_fset3 = cities_flayer.query()\n","cities_fset3.sdf.head(5)"]},{"cell_type":"markdown","metadata":{},"source":["## Conclusion\n","In this sample, we observed an edit intensive method to keep feature layers updated. We published data from first spreadsheet as a feature layer. We then updated existing features from second spread sheet (used geometry module to project the coordinates in the process), and added new features. The third spreadsheet presented additional attribute columns which were added to the feature layer by editing its definition and then updating the features with this additional data.\n","\n","This method is editing intensive and you may choose this when the number of features to edit is less or if you needed to selectively update certain features as updates come in.\n","\n","An alternate method is to overwrite the feature layer altogether when you always have current information coming in. This method is explained in the sample [Overwriting feature layers](https://developers.arcgis.com/python/sample-notebooks/overwriting-feature-layers)"]}],"metadata":{"anaconda-cloud":{},"esriNotebookRuntime":{"notebookRuntimeName":"ArcGIS Notebook Python 3 Standard","notebookRuntimeVersion":"4.0"},"kernelspec":{"display_name":"Python 3","language":"python","name":"python3"},"language_info":{"codemirror_mode":{"name":"ipython","version":3},"file_extension":".py","mimetype":"text/x-python","name":"python","nbconvert_exporter":"python","pygments_lexer":"ipython3","version":"3.8.2"},"toc":{"base_numbering":1,"nav_menu":{},"number_sections":true,"sideBar":true,"skip_h1_title":false,"title_cell":"Table of Contents","title_sidebar":"Contents","toc_cell":true,"toc_position":{},"toc_section_display":true,"toc_window_display":true}},"nbformat":4,"nbformat_minor":1} +{"cells":[{"cell_type":"markdown","metadata":{},"source":["# Updating features in a feature layer\n","\n","As content publishers, you may be required to keep certain web layers up to date. As new data arrives, you may have to append new features, update existing features etc. 
There are a couple of different options to accomplish this:\n"," \n"," - Method 1: editing individual features as updated datasets become available\n"," - Method 2: overwriting feature layers altogether with updated datasets\n"," \n","Depending on the number of features that are updated and on your workflow requirements, you may adopt either or both kinds of update mechanisms.\n","\n","In this sample, we explore the first method. For **Method 2**, refer to the sample titled [Overwriting feature layers](https://developers.arcgis.com/python/sample-notebooks/overwriting-feature-layers).\n","\n","**Note**: To run this sample, you need the ``pandas`` library in your conda environment. If you don't have the library, install it by running the following command from cmd.exe or your shell:\n","```\n","conda install pandas\n","```"]},{"cell_type":"markdown","metadata":{"toc":true},"source":["

Table of Contents

\n",""]},{"cell_type":"code","execution_count":3,"metadata":{},"outputs":[],"source":["# Connect to the GIS\n","from arcgis.gis import GIS\n","from arcgis import features\n","import pandas as pd"]},{"cell_type":"code","execution_count":4,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["Enter password: ········\n"]}],"source":["#Access the portal using \"amazing_arcgis_123\" as password for the given Username. \n","gis = GIS(\"https://pythonapi.playground.esri.com/portal\")"]},{"cell_type":"markdown","metadata":{},"source":["## Updating feature layer by editing individual features\n","\n","Let us consider a scenario where we need to update a feature layer containing the capital cities of the US. We have 3 csv datasets simulating an update workflow as described below:\n","\n"," 1. capitals_1.csv -- contains the initial, incomplete dataset\n"," 2. capitals_2.csv -- contains additional points and updates to existing points, building on top of capitals_1.csv\n"," 3. capitals_annex.csv -- an alternate table containing additional attribute information\n"," \n","Our goal is to update the feature layer with each of these datasets doing the necessary edit operations."]},{"cell_type":"markdown","metadata":{},"source":["### Publish the cities feature layer using the initial dataset"]},{"cell_type":"code","execution_count":5,"metadata":{},"outputs":[{"data":{"text/html":["
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
city_idnamestatecapitalpop2000pop2007longitudelatitude
01HonoluluHIState371657378587-157.82343621.305782
12JuneauAKState3071131592-134.51158258.351418
23Boise CityIDState185787203529-116.23765543.613736
34OlympiaWAState2751445523-122.89307347.042418
45SalemORState136924152039-123.02915544.931109
\n","
"],"text/plain":[" city_id name state capital pop2000 pop2007 longitude latitude\n","0 1 Honolulu HI State 371657 378587 -157.823436 21.305782\n","1 2 Juneau AK State 30711 31592 -134.511582 58.351418\n","2 3 Boise City ID State 185787 203529 -116.237655 43.613736\n","3 4 Olympia WA State 27514 45523 -122.893073 47.042418\n","4 5 Salem OR State 136924 152039 -123.029155 44.931109"]},"execution_count":5,"metadata":{},"output_type":"execute_result"}],"source":["# read the initial csv\n","csv1 = 'data/updating_gis_content/capitals_1.csv'\n","cities_df_1 = pd.read_csv(csv1)\n","cities_df_1.head()"]},{"cell_type":"code","execution_count":6,"metadata":{},"outputs":[{"data":{"text/plain":["(19, 8)"]},"execution_count":6,"metadata":{},"output_type":"execute_result"}],"source":["# print the number of records in this csv\n","cities_df_1.shape"]},{"cell_type":"markdown","metadata":{},"source":["As you can see, this dataset only contains 19 rows or 19 capital cities. It is not the complete dataset."]},{"cell_type":"markdown","metadata":{},"source":["Let's add this `csv` as a portal item. Adding the item creates a CSV item and uploads the original file to the portal, establishing a link between the item and the original file name. Therefore, we need a unique name for the file to guarantee it does not collide with any file of the same name that may have been uploaded by the same user. We'll use standard library modules to copy the file and give it a new name so we can add it to the portal"]},{"cell_type":"code","execution_count":7,"metadata":{},"outputs":[],"source":["import os\n","import datetime as dt\n","import shutil\n","\n","# assign variables to locations on the file system \n","cwd = os.path.abspath(os.getcwd())\n","data_pth = os.path.join(cwd, r'data/updating_gis_content/')\n","\n","# create a unique timestamp string to append to the file name\n","now_ts = str(int(dt.datetime.now().timestamp()))\n","\n","# copy the file, appending the unique string and assign it to a variable\n","my_csv = shutil.copyfile(os.path.abspath(csv1),\n"," os.path.join(data_pth, 'capitals_1_' + now_ts + '.csv'))"]},{"cell_type":"code","execution_count":8,"metadata":{},"outputs":[{"data":{"text/plain":["'\\\\path\\\\on\\\\computer\\\\05_content_publishers\\\\data/updating_gis_content/capitals_1_1548875507.csv'"]},"execution_count":8,"metadata":{},"output_type":"execute_result"}],"source":["my_csv"]},{"cell_type":"code","execution_count":9,"metadata":{},"outputs":[{"data":{"text/html":["
\n","
\n"," \n"," \n"," \n","
\n","\n","
\n"," USA Capitals spreadsheet 1548875507\n"," \n","
CSV by arcgis_python\n","
Last Modified: January 30, 2019\n","
0 comments, 0 views\n","
\n","
\n"," "],"text/plain":[""]},"execution_count":9,"metadata":{},"output_type":"execute_result"}],"source":["# add the initial csv file and publish that as a web layer\n","item_prop = {'title':'USA Capitals spreadsheet ' + now_ts, \"item_type\": \"CSV\"}\n","root_folder = gis.content.folders.get()\n","csv_item = root_folder.add(item_properties=item_prop, file=my_csv).result()\n","csv_item"]},{"cell_type":"markdown","metadata":{},"source":["This spreadsheet has co-ordinates as `latitude` and `longitude` columns which will be used for geometries during publishing."]},{"cell_type":"code","execution_count":10,"metadata":{},"outputs":[{"data":{"text/html":["
\n","
\n"," \n"," \n"," \n","
\n","\n","
\n"," USA Capitals spreadsheet 1548875507\n"," \n","
Feature Layer Collection by arcgis_python\n","
Last Modified: January 30, 2019\n","
0 comments, 0 views\n","
\n","
\n"," "],"text/plain":[""]},"execution_count":10,"metadata":{},"output_type":"execute_result"}],"source":["# publish the csv item into a feature layer\n","cities_item = csv_item.publish()\n","cities_item"]},{"cell_type":"code","execution_count":19,"metadata":{},"outputs":[{"data":{"text/html":["
\n","
\n"," \n"," \n"," \n","
\n","\n","
\n"," USA Capitals\n"," \n","
Feature Layer Collection by arcgis_python\n","
Last Modified: January 30, 2019\n","
0 comments, 0 views\n","
\n","
\n"," "],"text/plain":[""]},"execution_count":19,"metadata":{},"output_type":"execute_result"}],"source":["# update the item metadata\n","item_prop = {'title':'USA Capitals'}\n","cities_item.update(item_properties = item_prop, thumbnail='data/updating_gis_content/capital_cities.png')\n","cities_item"]},{"cell_type":"markdown","metadata":{},"source":["### Apply updates from the second spreadsheet\n","The next set of updates have arrived and are stored in `capitals_2.csv`. We are told it contains corrections for the original set in addition to new features. We need to figure out which rows have changed, apply `update` operation on those, then apply `add` operation to new rows.\n","\n","To start with, let us read the second csv file. Note, in this sample, data is stored in csv. In reality, it could be from your enterprise database or any other data source."]},{"cell_type":"code","execution_count":11,"metadata":{},"outputs":[{"data":{"text/html":["
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
city_idnamestatecapitalpop2000pop2007longitudelatitude
020Baton RougeLAState227818228810-91.14022730.458091
121HelenaMTState2578026007-112.02702746.595809
222BismarckNDState5553259344-100.77900046.813346
323PierreSDState1387614169-100.33638244.367964
424St. PaulMNState287151291643-93.11411844.954364
\n","
"],"text/plain":[" city_id name state capital pop2000 pop2007 longitude latitude\n","0 20 Baton Rouge LA State 227818 228810 -91.140227 30.458091\n","1 21 Helena MT State 25780 26007 -112.027027 46.595809\n","2 22 Bismarck ND State 55532 59344 -100.779000 46.813346\n","3 23 Pierre SD State 13876 14169 -100.336382 44.367964\n","4 24 St. Paul MN State 287151 291643 -93.114118 44.954364"]},"execution_count":11,"metadata":{},"output_type":"execute_result"}],"source":["# read the second csv set\n","csv2 = 'data/updating_gis_content/capitals_2.csv'\n","cities_df_2 = pd.read_csv(csv2)\n","cities_df_2.head()"]},{"cell_type":"code","execution_count":12,"metadata":{},"outputs":[{"data":{"text/plain":["(36, 8)"]},"execution_count":12,"metadata":{},"output_type":"execute_result"}],"source":["# get the dimensions of this csv\n","cities_df_2.shape"]},{"cell_type":"markdown","metadata":{},"source":["#### Identifying existing features that need to be updated\n","\n","To identify features that need to be updated, let us read the attribute table of the published feature layer and compare that against the second csv. To read the attribute table, we perform a `query()` on the feature layer which returns us an `arcgis.feature.FeatureSet` object. Refer to the guide pages on [accessing features from feature layers](https://developers.arcgis.com/python/guide/working-with-feature-layers-and-features/) to learn more about this.\n","\n","Note, at this point, we could work with the `cities_df_1` dataframe we created from the original csv file. However, in practice you may not always have the original dataset or your feature layer might have undergone edits after it was published. Hence, we query the feature layer directly."]},{"cell_type":"code","execution_count":13,"metadata":{},"outputs":[{"data":{"text/html":["
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
SHAPEcapitalcity_idlatitudelongitudenameobjectidpop2000pop2007state
0{\"x\": -17568824.553, \"y\": 2428377.352700006, \"...State121.305782-157.823436Honolulu1371657378587HI
1{\"x\": -14973760.769500002, \"y\": 8041504.674200...State258.351418-134.511582Juneau23071131592AK
2{\"x\": -12939516.521100001, \"y\": 5405860.248099...State343.613736-116.237655Boise City3185787203529ID
3{\"x\": -13680394.263900002, \"y\": 5949000.547900...State447.042418-122.893073Olympia42751445523WA
4{\"x\": -13695542.842799995, \"y\": 5610682.544100...State544.931109-123.029155Salem5136924152039OR
\n","
"],"text/plain":[" SHAPE capital city_id \\\n","0 {\"x\": -17568824.553, \"y\": 2428377.352700006, \"... State 1 \n","1 {\"x\": -14973760.769500002, \"y\": 8041504.674200... State 2 \n","2 {\"x\": -12939516.521100001, \"y\": 5405860.248099... State 3 \n","3 {\"x\": -13680394.263900002, \"y\": 5949000.547900... State 4 \n","4 {\"x\": -13695542.842799995, \"y\": 5610682.544100... State 5 \n","\n"," latitude longitude name objectid pop2000 pop2007 state \n","0 21.305782 -157.823436 Honolulu 1 371657 378587 HI \n","1 58.351418 -134.511582 Juneau 2 30711 31592 AK \n","2 43.613736 -116.237655 Boise City 3 185787 203529 ID \n","3 47.042418 -122.893073 Olympia 4 27514 45523 WA \n","4 44.931109 -123.029155 Salem 5 136924 152039 OR "]},"execution_count":13,"metadata":{},"output_type":"execute_result"}],"source":["cities_flayer = cities_item.layers[0]\n","cities_fset = cities_flayer.query() #querying without any conditions returns all the features\n","cities_fset.sdf.head()"]},{"cell_type":"markdown","metadata":{},"source":["The `city_id` column is common between both the datasets. Next, let us perform an `inner` join with the table from feature layer as left and updated csv as right. Inner joins will yield those rows that are present in both tables. Learn more about [inner joins here](https://www.w3schools.com/sql/sql_join_inner.asp)."]},{"cell_type":"code","execution_count":14,"metadata":{},"outputs":[{"data":{"text/html":["
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
SHAPEcapital_xcity_idlatitude_xlongitude_xname_xobjectidpop2000_xpop2007_xstate_xname_ystate_ycapital_ypop2000_ypop2007_ylongitude_ylatitude_y
0{'x': -13680394.263900002, 'y': 5949000.547900...State447.042418-122.893073Olympia42751445523WAOlympiaWAState4251445523-122.89307347.042418
1{'x': 11666510.350300007, 'y': -5033833.302499...State10-41.145545104.802046Cheyenne105301154750WYCheyenneWYState5301154750-104.80204641.145545
2{'x': -10857545.543799996, 'y': 4229619.674200...State1335.482309-97.534991Oklahoma City13506132552556OKKOklahoma CityOKState506132552556-97.53499135.482309
3{'x': -10421826.864700003, 'y': 5099899.263700...State1641.590936-93.620864Des Moines16200682201257IADes MoinesIAState198682201257-93.62086441.590936
\n","
"],"text/plain":[" SHAPE capital_x city_id \\\n","0 {'x': -13680394.263900002, 'y': 5949000.547900... State 4 \n","1 {'x': 11666510.350300007, 'y': -5033833.302499... State 10 \n","2 {'x': -10857545.543799996, 'y': 4229619.674200... State 13 \n","3 {'x': -10421826.864700003, 'y': 5099899.263700... State 16 \n","\n"," latitude_x longitude_x name_x objectid pop2000_x pop2007_x \\\n","0 47.042418 -122.893073 Olympia 4 27514 45523 \n","1 -41.145545 104.802046 Cheyenne 10 53011 54750 \n","2 35.482309 -97.534991 Oklahoma City 13 506132 552556 \n","3 41.590936 -93.620864 Des Moines 16 200682 201257 \n","\n"," state_x name_y state_y capital_y pop2000_y pop2007_y longitude_y \\\n","0 WA Olympia WA State 42514 45523 -122.893073 \n","1 WY Cheyenne WY State 53011 54750 -104.802046 \n","2 OKK Oklahoma City OK State 506132 552556 -97.534991 \n","3 IA Des Moines IA State 198682 201257 -93.620864 \n","\n"," latitude_y \n","0 47.042418 \n","1 41.145545 \n","2 35.482309 \n","3 41.590936 "]},"execution_count":14,"metadata":{},"output_type":"execute_result"}],"source":["overlap_rows = pd.merge(left = cities_fset.sdf, right = cities_df_2, how='inner',\n"," on = 'city_id')\n","overlap_rows"]},{"cell_type":"markdown","metadata":{},"source":["Thus, of 19 features in original and 36 features in second csv, 4 features are common. Inspecting the table, we find certain columns are updated, for instance, Cheyenne has its coordinates corrected, Oklahoma City has its state abbreviation corrected and similarly other cities have one of their attribute columns updated.\n","\n","We could either update individual attribute values for these 4 features or update all attribute values with the latest csv. Below, we are performing the latter as it is simple and fast.\n","\n","#### Perform updates to the existing features"]},{"cell_type":"code","execution_count":15,"metadata":{},"outputs":[],"source":["features_for_update = [] #list containing corrected features\n","all_features = cities_fset.features"]},{"cell_type":"code","execution_count":16,"metadata":{},"outputs":[{"data":{"text/plain":["{\"geometry\": {\"x\": -17568824.553, \"y\": 2428377.352700006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 1, \"city_id\": 1, \"name\": \"Honolulu\", \"state\": \"HI\", \"capital\": \"State\", \"pop2000\": 371657, \"pop2007\": 378587, \"longitude\": -157.8234362, \"latitude\": 21.30578163, \"SHAPE\": {\"x\": -17568824.553, \"y\": 2428377.352700006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}"]},"execution_count":16,"metadata":{},"output_type":"execute_result"}],"source":["# inspect one of the features\n","all_features[0]"]},{"cell_type":"markdown","metadata":{},"source":["Note the X and Y geometry values are different from decimal degree coordinates present in Longitude and Latitude fields. To perform geometry edits, we need to project the coordinates to match that of the feature layer."]},{"cell_type":"code","execution_count":17,"metadata":{},"outputs":[{"data":{"text/plain":["{'wkid': 102100, 'latestWkid': 3857}"]},"execution_count":17,"metadata":{},"output_type":"execute_result"}],"source":["# get the spatial reference of the features since we need to update the geometry\n","cities_fset.spatial_reference"]},{"cell_type":"markdown","metadata":{},"source":["Below, we prepare updated geometries and attributes for each of the 4 features we determined above. We use the `arcgis.geometry` module to `project` the coordinates from geographic to projected coordinate system. 
The cell below prints the original `Feature` objects followed by the updated ones. If you look closely, you can find the differences."]},{"cell_type":"code","execution_count":18,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["{\"geometry\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 4, \"city_id\": 4, \"name\": \"Olympia\", \"state\": \"WA\", \"capital\": \"State\", \"pop2000\": 27514, \"pop2007\": 45523, \"longitude\": -122.8930726, \"latitude\": 47.04241817, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","{\"geometry\": {\"x\": -13680394.2638528, \"y\": 5949000.54792491}, \"attributes\": {\"objectid\": 4, \"city_id\": 4, \"name\": \"Olympia\", \"state\": \"WA\", \"capital\": \"State\", \"pop2000\": 42514, \"pop2007\": 45523, \"longitude\": -122.8930726, \"latitude\": 47.04241817, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","========================================================================\n","{\"geometry\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 10, \"city_id\": 10, \"name\": \"Cheyenne\", \"state\": \"WY\", \"capital\": \"State\", \"pop2000\": 53011, \"pop2007\": 54750, \"longitude\": 104.8020456, \"latitude\": -41.14554516, \"SHAPE\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","{\"geometry\": {\"x\": -11666510.350285435, \"y\": 5033833.302497153}, \"attributes\": {\"objectid\": 10, \"city_id\": 10, \"name\": \"Cheyenne\", \"state\": \"WY\", \"capital\": \"State\", \"pop2000\": 53011, \"pop2007\": 54750, \"longitude\": -104.80204559999999, \"latitude\": 41.14554516, \"SHAPE\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","========================================================================\n","{\"geometry\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 13, \"city_id\": 13, \"name\": \"Oklahoma City\", \"state\": \"OKK\", \"capital\": \"State\", \"pop2000\": 506132, \"pop2007\": 552556, \"longitude\": -97.5349911, \"latitude\": 35.48230867, \"SHAPE\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","{\"geometry\": {\"x\": -10857545.54377847, \"y\": 4229619.674165817}, \"attributes\": {\"objectid\": 13, \"city_id\": 13, \"name\": \"Oklahoma City\", \"state\": \"OK\", \"capital\": \"State\", \"pop2000\": 506132, \"pop2007\": 552556, \"longitude\": -97.5349911, \"latitude\": 35.48230867, \"SHAPE\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","========================================================================\n","{\"geometry\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 16, \"city_id\": 16, \"name\": \"Des Moines\", \"state\": \"IA\", \"capital\": \"State\", \"pop2000\": 200682, \"pop2007\": 201257, \"longitude\": -93.62086361, 
\"latitude\": 41.59093617, \"SHAPE\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","{\"geometry\": {\"x\": -10421826.864691716, \"y\": 5099899.263692743}, \"attributes\": {\"objectid\": 16, \"city_id\": 16, \"name\": \"Des Moines\", \"state\": \"IA\", \"capital\": \"State\", \"pop2000\": 198682, \"pop2007\": 201257, \"longitude\": -93.62086361, \"latitude\": 41.59093617, \"SHAPE\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","========================================================================\n"]}],"source":["from arcgis import geometry #use geometry module to project Long,Lat to X and Y\n","from copy import deepcopy\n","\n","for city_id in overlap_rows['city_id']:\n"," # get the feature to be updated\n"," original_feature = [f for f in all_features if f.attributes['city_id'] == city_id][0]\n"," feature_to_be_updated = deepcopy(original_feature)\n"," \n"," print(str(original_feature))\n"," \n"," # get the matching row from csv\n"," matching_row = cities_df_2.where(cities_df_2.city_id == city_id).dropna()\n"," \n"," #get geometries in the destination coordinate system\n"," input_geometry = {'y':float(matching_row['latitude']),\n"," 'x':float(matching_row['longitude'])}\n"," output_geometry = geometry.project(geometries = [input_geometry],\n"," in_sr = 4326, \n"," out_sr = cities_fset.spatial_reference['latestWkid'],\n"," gis = gis)\n"," \n"," # assign the updated values\n"," feature_to_be_updated.geometry = output_geometry[0]\n"," feature_to_be_updated.attributes['longitude'] = float(matching_row['longitude'])\n"," feature_to_be_updated.attributes['city_id'] = int(matching_row['city_id'])\n"," feature_to_be_updated.attributes['state'] = matching_row['state'].values[0]\n"," feature_to_be_updated.attributes['capital'] = matching_row['capital'].values[0]\n"," feature_to_be_updated.attributes['latitude'] = float(matching_row['latitude'])\n"," feature_to_be_updated.attributes['name'] = matching_row['name'].values[0]\n"," feature_to_be_updated.attributes['pop2000'] = int(matching_row['pop2000'])\n"," feature_to_be_updated.attributes['pop2007'] = int(matching_row['pop2007'])\n"," \n"," #add this to the list of features to be updated\n"," features_for_update.append(feature_to_be_updated)\n"," \n"," print(str(feature_to_be_updated))\n"," print(\"========================================================================\")"]},{"cell_type":"markdown","metadata":{},"source":["We have constructed a list of features with updated values. 
We can use this list to perform updates on the feature layer."]},{"cell_type":"code","execution_count":19,"metadata":{},"outputs":[{"data":{"text/plain":["[{\"geometry\": {\"x\": -13680394.2638528, \"y\": 5949000.54792491}, \"attributes\": {\"objectid\": 4, \"city_id\": 4, \"name\": \"Olympia\", \"state\": \"WA\", \"capital\": \"State\", \"pop2000\": 42514, \"pop2007\": 45523, \"longitude\": -122.8930726, \"latitude\": 47.04241817, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}},\n"," {\"geometry\": {\"x\": -11666510.350285435, \"y\": 5033833.302497153}, \"attributes\": {\"objectid\": 10, \"city_id\": 10, \"name\": \"Cheyenne\", \"state\": \"WY\", \"capital\": \"State\", \"pop2000\": 53011, \"pop2007\": 54750, \"longitude\": -104.80204559999999, \"latitude\": 41.14554516, \"SHAPE\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}},\n"," {\"geometry\": {\"x\": -10857545.54377847, \"y\": 4229619.674165817}, \"attributes\": {\"objectid\": 13, \"city_id\": 13, \"name\": \"Oklahoma City\", \"state\": \"OK\", \"capital\": \"State\", \"pop2000\": 506132, \"pop2007\": 552556, \"longitude\": -97.5349911, \"latitude\": 35.48230867, \"SHAPE\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}},\n"," {\"geometry\": {\"x\": -10421826.864691716, \"y\": 5099899.263692743}, \"attributes\": {\"objectid\": 16, \"city_id\": 16, \"name\": \"Des Moines\", \"state\": \"IA\", \"capital\": \"State\", \"pop2000\": 198682, \"pop2007\": 201257, \"longitude\": -93.62086361, \"latitude\": 41.59093617, \"SHAPE\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}]"]},"execution_count":19,"metadata":{},"output_type":"execute_result"}],"source":["features_for_update"]},{"cell_type":"markdown","metadata":{},"source":["To update the feature layer, call the `edit_features()` method of the `FeatureLayer` object and pass the list of features to the `updates` parameter:"]},{"cell_type":"code","execution_count":20,"metadata":{},"outputs":[{"data":{"text/plain":["{'addResults': [],\n"," 'updateResults': [{'objectId': 4, 'globalId': None, 'success': True},\n"," {'objectId': 10, 'globalId': None, 'success': True},\n"," {'objectId': 13, 'globalId': None, 'success': True},\n"," {'objectId': 16, 'globalId': None, 'success': True}],\n"," 'deleteResults': [],\n"," 'attachments': {'addResults': [], 'updateResults': [], 'deleteResults': []}}"]},"execution_count":20,"metadata":{},"output_type":"execute_result"}],"source":["cities_flayer.edit_features(updates= features_for_update)"]},{"cell_type":"markdown","metadata":{},"source":["We have successfully applied corrections to those features which existed in the feature layer from the initial dataset. Next let us proceed to adding new features present only in the second csv file.\n","\n","#### Identifying new features that need to be added"]},{"cell_type":"code","execution_count":21,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["(32, 8)\n"]}],"source":["#select those rows in the capitals_2.csv that do not overlap with those in capitals_1.csv\n","new_rows = cities_df_2[~cities_df_2['city_id'].isin(overlap_rows['city_id'])]\n","print(new_rows.shape)"]},{"cell_type":"code","execution_count":22,"metadata":{},"outputs":[{"data":{"text/html":["
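A design note on the update loop above (and the similar add loop that follows): `geometry.project()` accepts a list of geometries, yet the sample calls it once per feature, paying one server round-trip each time. For larger batches, you can project all points in a single call and zip the results back; a sketch reusing the sample's variables:

```python
# Sketch: one project() call for the whole batch instead of one per row.
from arcgis import geometry

coords = [
    {"x": float(row["longitude"]), "y": float(row["latitude"])}
    for _, row in new_rows.iterrows()
]
projected = geometry.project(
    geometries=coords,
    in_sr=4326,
    out_sr=cities_fset.spatial_reference["latestWkid"],
    gis=gis,
)
# projected[i] is the projected point for the i-th row of new_rows
```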
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
city_idnamestatecapitalpop2000pop2007longitudelatitude
020Baton RougeLAState227818228810-91.14022730.458091
121HelenaMTState2578026007-112.02702746.595809
222BismarckNDState5553259344-100.77900046.813346
323PierreSDState1387614169-100.33638244.367964
424St. PaulMNState287151291643-93.11411844.954364
\n","
"],"text/plain":[" city_id name state capital pop2000 pop2007 longitude latitude\n","0 20 Baton Rouge LA State 227818 228810 -91.140227 30.458091\n","1 21 Helena MT State 25780 26007 -112.027027 46.595809\n","2 22 Bismarck ND State 55532 59344 -100.779000 46.813346\n","3 23 Pierre SD State 13876 14169 -100.336382 44.367964\n","4 24 St. Paul MN State 287151 291643 -93.114118 44.954364"]},"execution_count":22,"metadata":{},"output_type":"execute_result"}],"source":["new_rows.head()"]},{"cell_type":"markdown","metadata":{},"source":["Thus, of the total 36 rows in the second csv, we have determined the 32 other rows which are new and need to be appended as new features."]},{"cell_type":"markdown","metadata":{},"source":["#### Adding new features\n","Next, let us compose another `list` of `Feature` objects similar to earlier, from the `new_rows` data frame."]},{"cell_type":"code","execution_count":23,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["Creating Baton Rouge\n","Creating Helena\n","Creating Bismarck\n","Creating Pierre\n","Creating St. Paul\n","Creating Madison\n","Creating Lansing\n","Creating Augusta\n","Creating Montpelier\n","Creating Albany\n","Creating Boston\n","Creating Concord\n","Creating Tallahassee\n","Creating Jackson\n","Creating Nashville\n","Creating Montgomery\n","Creating Springfield\n","Creating Indianapolis\n","Creating Frankfort\n","Creating Columbus\n","Creating Charleston\n","Creating Atlanta\n","Creating Columbia\n","Creating Raleigh\n","Creating Richmond\n","Creating Harrisburg\n","Creating Trenton\n","Creating Dover\n","Creating Washington\n","Creating Annapolis\n","Creating Hartford\n","Creating Providence\n"]}],"source":["features_to_be_added = []\n","\n","# get a template feature object\n","template_feature = deepcopy(features_for_update[0])\n","\n","# loop through each row and add to the list of features to be added\n","for row in new_rows.iterrows():\n"," new_feature = deepcopy(template_feature)\n"," \n"," #print\n"," print(\"Creating \" + row[1]['name'])\n"," \n"," #get geometries in the destination coordinate system\n"," input_geometry = {'y':float(row[1]['latitude']),\n"," 'x':float(row[1]['longitude'])}\n"," output_geometry = geometry.project(geometries = [input_geometry],\n"," in_sr = 4326, \n"," out_sr = cities_fset.spatial_reference['latestWkid'],\n"," gis = gis)\n"," \n"," # assign the updated values\n"," new_feature.geometry = output_geometry[0]\n"," new_feature.attributes['longitude'] = float(row[1]['longitude'])\n"," new_feature.attributes['city_id'] = int(row[1]['city_id'])\n"," new_feature.attributes['state'] = row[1]['state']\n"," new_feature.attributes['capital'] = row[1]['capital']\n"," new_feature.attributes['latitude'] = float(row[1]['latitude'])\n"," new_feature.attributes['name'] = row[1]['name']\n"," new_feature.attributes['pop2000'] = int(row[1]['pop2000'])\n"," new_feature.attributes['pop2007'] = int(row[1]['pop2007'])\n"," \n"," #add this to the list of features to be updated\n"," features_to_be_added.append(new_feature)"]},{"cell_type":"code","execution_count":24,"metadata":{},"outputs":[{"data":{"text/plain":["{\"geometry\": {\"x\": -10145683.671555312, \"y\": 3562570.141302621}, \"attributes\": {\"objectid\": 4, \"city_id\": 20, \"name\": \"Baton Rouge\", \"state\": \"LA\", \"capital\": \"State\", \"pop2000\": 227818, \"pop2007\": 228810, \"longitude\": -91.14022709999999, \"latitude\": 30.45809113, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 
102100, \"latestWkid\": 3857}}}}"]},"execution_count":24,"metadata":{},"output_type":"execute_result"}],"source":["# take a look at one of the features we created\n","features_to_be_added[0]"]},{"cell_type":"markdown","metadata":{},"source":["Thus, we have created a `list` of `Feature` objects with appropriate attributes and geometries. Next, to add these new features to the feature layer, call the `edit_features()` method of the `FeatureLayer` object and pass the list of `Feature` objects to the `adds` parameter:"]},{"cell_type":"code","execution_count":25,"metadata":{},"outputs":[{"data":{"text/plain":["{'addResults': [{'objectId': 20, 'globalId': None, 'success': True},\n"," {'objectId': 21, 'globalId': None, 'success': True},\n"," {'objectId': 22, 'globalId': None, 'success': True},\n"," {'objectId': 23, 'globalId': None, 'success': True},\n"," {'objectId': 24, 'globalId': None, 'success': True},\n"," {'objectId': 25, 'globalId': None, 'success': True},\n"," {'objectId': 26, 'globalId': None, 'success': True},\n"," {'objectId': 27, 'globalId': None, 'success': True},\n"," {'objectId': 28, 'globalId': None, 'success': True},\n"," {'objectId': 29, 'globalId': None, 'success': True},\n"," {'objectId': 30, 'globalId': None, 'success': True},\n"," {'objectId': 31, 'globalId': None, 'success': True},\n"," {'objectId': 32, 'globalId': None, 'success': True},\n"," {'objectId': 33, 'globalId': None, 'success': True},\n"," {'objectId': 34, 'globalId': None, 'success': True},\n"," {'objectId': 35, 'globalId': None, 'success': True},\n"," {'objectId': 36, 'globalId': None, 'success': True},\n"," {'objectId': 37, 'globalId': None, 'success': True},\n"," {'objectId': 38, 'globalId': None, 'success': True},\n"," {'objectId': 39, 'globalId': None, 'success': True},\n"," {'objectId': 40, 'globalId': None, 'success': True},\n"," {'objectId': 41, 'globalId': None, 'success': True},\n"," {'objectId': 42, 'globalId': None, 'success': True},\n"," {'objectId': 43, 'globalId': None, 'success': True},\n"," {'objectId': 44, 'globalId': None, 'success': True},\n"," {'objectId': 45, 'globalId': None, 'success': True},\n"," {'objectId': 46, 'globalId': None, 'success': True},\n"," {'objectId': 47, 'globalId': None, 'success': True},\n"," {'objectId': 48, 'globalId': None, 'success': True},\n"," {'objectId': 49, 'globalId': None, 'success': True},\n"," {'objectId': 50, 'globalId': None, 'success': True},\n"," {'objectId': 51, 'globalId': None, 'success': True}],\n"," 'updateResults': [],\n"," 'deleteResults': [],\n"," 'attachments': {'addResults': [], 'updateResults': [], 'deleteResults': []}}"]},"execution_count":25,"metadata":{},"output_type":"execute_result"}],"source":["cities_flayer.edit_features(adds = features_to_be_added)"]},{"cell_type":"markdown","metadata":{},"source":["Thus, we have successfully applied edits from second csv file. Next let us look at how we can apply edits from third csv file.\n","\n","### Apply edits from third spreadsheet\n","The next set of updates have arrived and are stored in `capitals_annex.csv`. We are told it contains additional columns for each of the features that we want to add to the feature layer.\n","\n","To start with, let us read the third csv file. Note in this sample, data is stored in csv. In reality, it could be from your enterprise database or any other data source."]},{"cell_type":"code","execution_count":26,"metadata":{},"outputs":[{"data":{"text/html":["
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
city_idnameclasswhiteblackameri_esasianhawn_plhispanicmalesfemales
01HonoluluCensus Designated Place7309360386892075882545716229182628189029
12JuneauCity and Borough229692483496143811610401546915242
23Boise CityCity17120414371300387030284109201493773
34OlympiaCity36246805553247312518632031922195
45SalemCity113746175020643304643199736875268172
\n","
"],"text/plain":[" city_id name class white black ameri_es \\\n","0 1 Honolulu Census Designated Place 73093 6038 689 \n","1 2 Juneau City and Borough 22969 248 3496 \n","2 3 Boise City City 171204 1437 1300 \n","3 4 Olympia City 36246 805 553 \n","4 5 Salem City 113746 1750 2064 \n","\n"," asian hawn_pl hispanic males females \n","0 207588 25457 16229 182628 189029 \n","1 1438 116 1040 15469 15242 \n","2 3870 302 8410 92014 93773 \n","3 2473 125 1863 20319 22195 \n","4 3304 643 19973 68752 68172 "]},"execution_count":26,"metadata":{},"output_type":"execute_result"}],"source":["# read the third csv set\n","csv3 = 'data/updating_gis_content/capitals_annex.csv'\n","cities_df_3 = pd.read_csv(csv3)\n","cities_df_3.head()"]},{"cell_type":"code","execution_count":27,"metadata":{},"outputs":[{"data":{"text/plain":["(51, 11)"]},"execution_count":27,"metadata":{},"output_type":"execute_result"}],"source":["#find the number of rows in the third csv\n","cities_df_3.shape"]},{"cell_type":"markdown","metadata":{},"source":["The `capitals_annex.csv` does not add new features, instead it adds additional attribute columns to existing features. It has 51 rows which were found to match the 19 + 32 rows from first and second csv files. The columns `City_ID` and `NAME` are common to all 3 spreadsheets. Next let us take a look at how we can append this additional attribute information to our feature layer.\n","\n","#### Inspecting existing fields of the feature layer\n","The `manager` property of the `FeatureLayer` object exposes a set of methods to read and update the properties and definition of feature layers."]},{"cell_type":"code","execution_count":28,"metadata":{},"outputs":[{"data":{"text/plain":["{\n"," \"name\": \"city_id\",\n"," \"type\": \"esriFieldTypeInteger\",\n"," \"alias\": \"city_id\",\n"," \"domain\": null,\n"," \"editable\": true,\n"," \"nullable\": true,\n"," \"sqlType\": \"sqlTypeInteger\",\n"," \"length\": 10\n","}"]},"execution_count":28,"metadata":{},"output_type":"execute_result"}],"source":["#Get the existing list of fields on the cities feature layer\n","cities_fields = cities_flayer.manager.properties.fields\n","\n","# Your feature layer may have multiple fields, \n","# instead of printing all, let us take a look at one of the fields:\n","cities_fields[1]"]},{"cell_type":"markdown","metadata":{},"source":["From above, we can see the representation of one of the fields. Let us loop through each of the fields and print the `name`, `alias`, `type` and `sqlType` properties"]},{"cell_type":"code","execution_count":29,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["objectid | objectid | esriFieldTypeOID | sqlTypeInteger\n","city_id | city_id | esriFieldTypeInteger | sqlTypeInteger\n","name | name | esriFieldTypeString | sqlTypeVarchar\n","state | state | esriFieldTypeString | sqlTypeVarchar\n","capital | capital | esriFieldTypeString | sqlTypeVarchar\n","pop2000 | pop2000 | esriFieldTypeInteger | sqlTypeInteger\n","pop2007 | pop2007 | esriFieldTypeInteger | sqlTypeInteger\n","longitude | longitude | esriFieldTypeDouble | sqlTypeNumeric\n","latitude | latitude | esriFieldTypeDouble | sqlTypeNumeric\n"]}],"source":["for field in cities_fields:\n"," print(f\"{field.name:13}| {field.alias:13}| {field.type:25}| {field.sqlType}\")"]},{"cell_type":"markdown","metadata":{},"source":["#### Preparing additional columns to add to the feature layer\n","Now that we have an idea of how the fields are defined, we can go ahead and append new fields to the layer's definition. 
Once we compose a list of new fields, we can push those changes to the feature layer by calling the `add_to_definition()` method. Once the feature layer's definition is updated with new fields, we can loop through each feature and add the appropriate attribute values.\n","\n","To compose a list of new fields to be added, we start by making a copy of one of the fields as a template and start editing it. One convenient aspect of this example is that all of the new fields, except one, share the same data type: integer. With your data, this may not be the case. In such instances, you can add each field individually."]},{"cell_type":"code","execution_count":30,"metadata":{},"outputs":[{"data":{"text/plain":["{'name': 'city_id',\n"," 'type': 'esriFieldTypeInteger',\n"," 'alias': 'city_id',\n"," 'domain': None,\n"," 'editable': True,\n"," 'nullable': True,\n"," 'sqlType': 'sqlTypeInteger',\n"," 'length': 10}"]},"execution_count":30,"metadata":{},"output_type":"execute_result"}],"source":["# get a template field\n","template_field = dict(deepcopy(cities_fields[1]))\n","template_field"]},{"cell_type":"markdown","metadata":{},"source":["Let us use pandas to get the list of fields that are **new** in spreadsheet 3"]},{"cell_type":"code","execution_count":31,"metadata":{},"outputs":[{"data":{"text/plain":["['ameri_es',\n"," 'asian',\n"," 'black',\n"," 'class',\n"," 'females',\n"," 'hawn_pl',\n"," 'hispanic',\n"," 'males',\n"," 'white']"]},"execution_count":31,"metadata":{},"output_type":"execute_result"}],"source":["# get the list of new fields to add from the third spreadsheet, that are not in spreadsheets 1 and 2\n","new_field_names = list(cities_df_3.columns.difference(cities_df_1.columns))\n","new_field_names"]},{"cell_type":"markdown","metadata":{},"source":["Now loop through each new field name and create a field dictionary using the template we created earlier (a dtype-driven alternative is sketched below). 
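An aside before the manual loop below: when the new columns span several different types, deriving each field definition from the pandas dtype scales better than special-casing column names. A sketch; the dtype-to-esri mapping is ours and only covers the types that occur in this sample:

```python
# Sketch: build field definitions from the dataframe's dtypes.
from copy import deepcopy
import pandas.api.types as ptypes

def field_from_dtype(name, dtype, template):
    fld = deepcopy(template)
    fld["name"], fld["alias"] = name.lower(), name
    if ptypes.is_integer_dtype(dtype):
        fld["type"], fld["sqlType"] = "esriFieldTypeInteger", "sqlTypeInteger"
    elif ptypes.is_float_dtype(dtype):
        fld["type"], fld["sqlType"] = "esriFieldTypeDouble", "sqlTypeNumeric"
    else:  # fall back to text for object columns such as `class`
        fld["type"], fld["sqlType"] = "esriFieldTypeString", "sqlTypeVarchar"
        fld["length"] = 8000
    return fld

fields_auto = [
    field_from_dtype(col, cities_df_3[col].dtype, template_field)
    for col in new_field_names
]
```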
Except the field titled `class` all other fields are of type `integer`."]},{"cell_type":"code","execution_count":32,"metadata":{},"outputs":[{"data":{"text/plain":["9"]},"execution_count":32,"metadata":{},"output_type":"execute_result"}],"source":["fields_to_be_added = []\n","for new_field_name in new_field_names:\n"," current_field = deepcopy(template_field)\n"," if new_field_name.lower() == 'class':\n"," current_field['sqlType'] = 'sqlTypeVarchar'\n"," current_field['type'] = 'esriFieldTypeString'\n"," current_field['length'] = 8000\n"," current_field['name'] = new_field_name.lower()\n"," current_field['alias'] = new_field_name\n"," fields_to_be_added.append(current_field)\n"," \n","len(fields_to_be_added)"]},{"cell_type":"code","execution_count":33,"metadata":{},"outputs":[{"data":{"text/plain":["{'name': 'class',\n"," 'type': 'esriFieldTypeString',\n"," 'alias': 'class',\n"," 'domain': None,\n"," 'editable': True,\n"," 'nullable': True,\n"," 'sqlType': 'sqlTypeVarchar',\n"," 'length': 8000}"]},"execution_count":33,"metadata":{},"output_type":"execute_result"}],"source":["#inspect one of the fields\n","fields_to_be_added[3]"]},{"cell_type":"markdown","metadata":{},"source":["#### Adding additional fields to the feature layer\n","The list of new fields we composed can be pushed to the server by calling `add_to_definition()` method on the `manager` property."]},{"cell_type":"code","execution_count":34,"metadata":{},"outputs":[{"data":{"text/plain":["{'success': True}"]},"execution_count":34,"metadata":{},"output_type":"execute_result"}],"source":["cities_flayer.manager.add_to_definition({'fields':fields_to_be_added})"]},{"cell_type":"markdown","metadata":{},"source":["Thus, we have successfully added new fields to our feature layer. Let us verify the new columns show up:"]},{"cell_type":"code","execution_count":35,"metadata":{},"outputs":[{"data":{"text/plain":["18"]},"execution_count":35,"metadata":{},"output_type":"execute_result"}],"source":["new_cities_fields = cities_flayer.manager.properties.fields\n","len(new_cities_fields)"]},{"cell_type":"code","execution_count":36,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["objectid | esriFieldTypeOID\n","city_id | esriFieldTypeInteger\n","name | esriFieldTypeString\n","state | esriFieldTypeString\n","capital | esriFieldTypeString\n","pop2000 | esriFieldTypeInteger\n","pop2007 | esriFieldTypeInteger\n","longitude | esriFieldTypeDouble\n","latitude | esriFieldTypeDouble\n","ameri_es | esriFieldTypeInteger\n","asian | esriFieldTypeInteger\n","black | esriFieldTypeInteger\n","class | esriFieldTypeString\n","females | esriFieldTypeInteger\n","hawn_pl | esriFieldTypeInteger\n","hispanic | esriFieldTypeInteger\n","males | esriFieldTypeInteger\n","white | esriFieldTypeInteger\n"]}],"source":["for field in new_cities_fields:\n"," print(f\"{field.name:10}| {field.type}\")"]},{"cell_type":"markdown","metadata":{},"source":["#### Adding attribute values to the new columns\n","Next we can loop through each row in the third csv and add the new attribute values for these newly created columns."]},{"cell_type":"code","execution_count":37,"metadata":{},"outputs":[],"source":["# Run a fresh query on the feature layer so it includes the new features from\n","# csv2 and new columns from csv3\n","cities_fset2 = cities_flayer.query()\n","cities_features2 = cities_fset2.features"]},{"cell_type":"markdown","metadata":{},"source":["Loop through each row in the third spreadsheet, find the corresponding feature by matching the `city_id` 
value and apply the attribute values for the new fields."]},{"cell_type":"code","execution_count":38,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["1 Adding additional attributes for: Honolulu\n","2 Adding additional attributes for: Juneau\n","3 Adding additional attributes for: Boise City\n","4 Adding additional attributes for: Olympia\n","5 Adding additional attributes for: Salem\n","6 Adding additional attributes for: Carson\n","7 Adding additional attributes for: Sacramento\n","8 Adding additional attributes for: Phoenix\n","9 Adding additional attributes for: Salt Lake City\n","10 Adding additional attributes for: Cheyenne\n","11 Adding additional attributes for: Denver\n","12 Adding additional attributes for: Santa Fe\n","13 Adding additional attributes for: Oklahoma City\n","14 Adding additional attributes for: Topeka\n","15 Adding additional attributes for: Lincoln\n","16 Adding additional attributes for: Des Moines\n","17 Adding additional attributes for: Jefferson City\n","18 Adding additional attributes for: Little Rock\n","19 Adding additional attributes for: Austin\n","20 Adding additional attributes for: Baton Rouge\n","21 Adding additional attributes for: Helena\n","22 Adding additional attributes for: Bismarck\n","23 Adding additional attributes for: Pierre\n","24 Adding additional attributes for: St. Paul\n","25 Adding additional attributes for: Madison\n","26 Adding additional attributes for: Lansing\n","27 Adding additional attributes for: Augusta\n","28 Adding additional attributes for: Montpelier\n","29 Adding additional attributes for: Albany\n","30 Adding additional attributes for: Boston\n","31 Adding additional attributes for: Concord\n","32 Adding additional attributes for: Tallahassee\n","33 Adding additional attributes for: Jackson\n","34 Adding additional attributes for: Nashville\n","35 Adding additional attributes for: Montgomery\n","36 Adding additional attributes for: Springfield\n","37 Adding additional attributes for: Indianapolis\n","38 Adding additional attributes for: Frankfort\n","39 Adding additional attributes for: Columbus\n","40 Adding additional attributes for: Charleston\n","41 Adding additional attributes for: Atlanta\n","42 Adding additional attributes for: Columbia\n","43 Adding additional attributes for: Raleigh\n","44 Adding additional attributes for: Richmond\n","45 Adding additional attributes for: Harrisburg\n","46 Adding additional attributes for: Trenton\n","47 Adding additional attributes for: Dover\n","48 Adding additional attributes for: Washington\n","49 Adding additional attributes for: Annapolis\n","50 Adding additional attributes for: Hartford\n","51 Adding additional attributes for: Providence\n"]}],"source":["features_for_update = []\n","for city_id in cities_df_3['city_id']:\n"," # get the matching row from csv\n"," matching_row = cities_df_3.where(cities_df_3.city_id == city_id).dropna()\n"," \n"," print(str(city_id) + \" Adding additional attributes for: \" + matching_row['name'].values[0])\n"," \n"," # get the feature to be updated\n"," original_feature = [f for f in cities_features2 if f.attributes['city_id'] == city_id][0]\n"," feature_to_be_updated = deepcopy(original_feature)\n"," \n"," # assign the updated values\n"," feature_to_be_updated.attributes['class'] = matching_row['class'].values[0]\n"," feature_to_be_updated.attributes['white'] = int(matching_row['white'])\n"," feature_to_be_updated.attributes['black'] = int(matching_row['black'])\n"," feature_to_be_updated.attributes['ameri_es'] 
= int(matching_row['ameri_es'])\n"," feature_to_be_updated.attributes['asian'] = int(matching_row['asian'])\n"," feature_to_be_updated.attributes['hawn_pl'] = int(matching_row['hawn_pl'])\n"," feature_to_be_updated.attributes['hispanic'] = int(matching_row['hispanic'])\n"," feature_to_be_updated.attributes['males'] = int(matching_row['males'])\n"," feature_to_be_updated.attributes['females'] = int(matching_row['females'])\n"," \n"," #add this to the list of features to be updated\n"," features_for_update.append(feature_to_be_updated)"]},{"cell_type":"code","execution_count":39,"metadata":{},"outputs":[{"data":{"text/plain":["{\"geometry\": {\"x\": -7950674.8190312665, \"y\": 5134585.3226086125}, \"attributes\": {\"objectid\": 51, \"city_id\": 51, \"name\": \"Providence\", \"state\": \"RI\", \"capital\": \"State\", \"pop2000\": 173618, \"pop2007\": 183731, \"longitude\": -71.42212709, \"latitude\": 41.82355466, \"ameri_es\": 1975, \"asian\": 10432, \"black\": 25243, \"class\": \"City\", \"females\": 90583, \"hawn_pl\": 270, \"hispanic\": 52146, \"males\": 83035, \"white\": 94666}}"]},"execution_count":39,"metadata":{},"output_type":"execute_result"}],"source":["# inspect one of the features\n","features_for_update[-1]"]},{"cell_type":"code","execution_count":40,"metadata":{},"outputs":[{"data":{"text/plain":["{'addResults': [],\n"," 'updateResults': [{'objectId': 1, 'globalId': None, 'success': True},\n"," {'objectId': 2, 'globalId': None, 'success': True},\n"," {'objectId': 3, 'globalId': None, 'success': True},\n"," {'objectId': 4, 'globalId': None, 'success': True},\n"," {'objectId': 5, 'globalId': None, 'success': True},\n"," {'objectId': 6, 'globalId': None, 'success': True},\n"," {'objectId': 7, 'globalId': None, 'success': True},\n"," {'objectId': 8, 'globalId': None, 'success': True},\n"," {'objectId': 9, 'globalId': None, 'success': True},\n"," {'objectId': 10, 'globalId': None, 'success': True},\n"," {'objectId': 11, 'globalId': None, 'success': True},\n"," {'objectId': 12, 'globalId': None, 'success': True},\n"," {'objectId': 13, 'globalId': None, 'success': True},\n"," {'objectId': 14, 'globalId': None, 'success': True},\n"," {'objectId': 15, 'globalId': None, 'success': True},\n"," {'objectId': 16, 'globalId': None, 'success': True},\n"," {'objectId': 17, 'globalId': None, 'success': True},\n"," {'objectId': 18, 'globalId': None, 'success': True},\n"," {'objectId': 19, 'globalId': None, 'success': True},\n"," {'objectId': 20, 'globalId': None, 'success': True},\n"," {'objectId': 21, 'globalId': None, 'success': True},\n"," {'objectId': 22, 'globalId': None, 'success': True},\n"," {'objectId': 23, 'globalId': None, 'success': True},\n"," {'objectId': 24, 'globalId': None, 'success': True},\n"," {'objectId': 25, 'globalId': None, 'success': True},\n"," {'objectId': 26, 'globalId': None, 'success': True},\n"," {'objectId': 27, 'globalId': None, 'success': True},\n"," {'objectId': 28, 'globalId': None, 'success': True},\n"," {'objectId': 29, 'globalId': None, 'success': True},\n"," {'objectId': 30, 'globalId': None, 'success': True},\n"," {'objectId': 31, 'globalId': None, 'success': True},\n"," {'objectId': 32, 'globalId': None, 'success': True},\n"," {'objectId': 33, 'globalId': None, 'success': True},\n"," {'objectId': 34, 'globalId': None, 'success': True},\n"," {'objectId': 35, 'globalId': None, 'success': True},\n"," {'objectId': 36, 'globalId': None, 'success': True},\n"," {'objectId': 37, 'globalId': None, 'success': True},\n"," {'objectId': 38, 'globalId': None, 'success': 
True},\n"," {'objectId': 39, 'globalId': None, 'success': True},\n"," {'objectId': 40, 'globalId': None, 'success': True},\n"," {'objectId': 41, 'globalId': None, 'success': True},\n"," {'objectId': 42, 'globalId': None, 'success': True},\n"," {'objectId': 43, 'globalId': None, 'success': True},\n"," {'objectId': 44, 'globalId': None, 'success': True},\n"," {'objectId': 45, 'globalId': None, 'success': True},\n"," {'objectId': 46, 'globalId': None, 'success': True},\n"," {'objectId': 47, 'globalId': None, 'success': True},\n"," {'objectId': 48, 'globalId': None, 'success': True},\n"," {'objectId': 49, 'globalId': None, 'success': True},\n"," {'objectId': 50, 'globalId': None, 'success': True},\n"," {'objectId': 51, 'globalId': None, 'success': True}],\n"," 'deleteResults': [],\n"," 'attachments': {'addResults': [], 'updateResults': [], 'deleteResults': []}}"]},"execution_count":40,"metadata":{},"output_type":"execute_result"}],"source":["# apply the edits to the feature layer\n","cities_flayer.edit_features(updates= features_for_update)"]},{"cell_type":"markdown","metadata":{},"source":["#### Verify the changes made so far\n","Let us run another query on the feature layer and visualize a few rows."]},{"cell_type":"code","execution_count":41,"metadata":{},"outputs":[{"data":{"text/html":["
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
SHAPEameri_esasianblackcapitalcity_idclassfemaleshawn_plhispaniclatitudelongitudemalesnameobjectidpop2000pop2007statewhite
0{\"x\": -17568824.553, \"y\": 2428377.352700006, \"...6892075886038State1Census Designated Place189029254571622921.305782-157.823436182628Honolulu1371657378587HI73093
1{\"x\": -14973760.769500002, \"y\": 8041504.674200...34961438248State2City and Borough15242116104058.351418-134.51158215469Juneau23071131592AK22969
2{\"x\": -12939516.521100001, \"y\": 5405860.248099...130038701437State3City93773302841043.613736-116.23765592014Boise City3185787203529ID171204
3{\"x\": -13680394.263852797, \"y\": 5949000.547924...5532473805State4City22195125186347.042418-122.89307320319Olympia44251445523WA36246
4{\"x\": -13695542.842799995, \"y\": 5610682.544100...206433041750State5City681726431997344.931109-123.02915568752Salem5136924152039OR113746
\n","
"],"text/plain":[" SHAPE ameri_es asian black \\\n","0 {\"x\": -17568824.553, \"y\": 2428377.352700006, \"... 689 207588 6038 \n","1 {\"x\": -14973760.769500002, \"y\": 8041504.674200... 3496 1438 248 \n","2 {\"x\": -12939516.521100001, \"y\": 5405860.248099... 1300 3870 1437 \n","3 {\"x\": -13680394.263852797, \"y\": 5949000.547924... 553 2473 805 \n","4 {\"x\": -13695542.842799995, \"y\": 5610682.544100... 2064 3304 1750 \n","\n"," capital city_id class females hawn_pl hispanic \\\n","0 State 1 Census Designated Place 189029 25457 16229 \n","1 State 2 City and Borough 15242 116 1040 \n","2 State 3 City 93773 302 8410 \n","3 State 4 City 22195 125 1863 \n","4 State 5 City 68172 643 19973 \n","\n"," latitude longitude males name objectid pop2000 pop2007 \\\n","0 21.305782 -157.823436 182628 Honolulu 1 371657 378587 \n","1 58.351418 -134.511582 15469 Juneau 2 30711 31592 \n","2 43.613736 -116.237655 92014 Boise City 3 185787 203529 \n","3 47.042418 -122.893073 20319 Olympia 4 42514 45523 \n","4 44.931109 -123.029155 68752 Salem 5 136924 152039 \n","\n"," state white \n","0 HI 73093 \n","1 AK 22969 \n","2 ID 171204 \n","3 WA 36246 \n","4 OR 113746 "]},"execution_count":41,"metadata":{},"output_type":"execute_result"}],"source":["cities_fset3 = cities_flayer.query()\n","cities_fset3.sdf.head(5)"]},{"cell_type":"markdown","metadata":{},"source":["## Conclusion\n","In this sample, we observed an edit intensive method to keep feature layers updated. We published data from first spreadsheet as a feature layer. We then updated existing features from second spread sheet (used geometry module to project the coordinates in the process), and added new features. The third spreadsheet presented additional attribute columns which were added to the feature layer by editing its definition and then updating the features with this additional data.\n","\n","This method is editing intensive and you may choose this when the number of features to edit is less or if you needed to selectively update certain features as updates come in.\n","\n","An alternate method is to overwrite the feature layer altogether when you always have current information coming in. 
This method is explained in the sample [Overwriting feature layers](https://developers.arcgis.com/python/sample-notebooks/overwriting-feature-layers)"]}],"metadata":{"anaconda-cloud":{},"esriNotebookRuntime":{"notebookRuntimeName":"ArcGIS Notebook Python 3 Standard","notebookRuntimeVersion":"4.0"},"kernelspec":{"display_name":"Python 3","language":"python","name":"python3"},"language_info":{"codemirror_mode":{"name":"ipython","version":3},"file_extension":".py","mimetype":"text/x-python","name":"python","nbconvert_exporter":"python","pygments_lexer":"ipython3","version":"3.8.2"},"toc":{"base_numbering":1,"nav_menu":{},"number_sections":true,"sideBar":true,"skip_h1_title":false,"title_cell":"Table of Contents","title_sidebar":"Contents","toc_cell":true,"toc_position":{},"toc_section_display":true,"toc_window_display":true}},"nbformat":4,"nbformat_minor":1} From 4576738d61eb730e9e670c06d32bca0dbffccc0c Mon Sep 17 00:00:00 2001 From: Shuang Song <103439749+cariashuang0417@users.noreply.github.com> Date: Thu, 24 Oct 2024 16:13:32 -0400 Subject: [PATCH 3/3] type --- ...updating_features_in_a_feature_layer.ipynb | 2360 ++++++++++++++++- 1 file changed, 2359 insertions(+), 1 deletion(-) diff --git a/samples/05_content_publishers/updating_features_in_a_feature_layer.ipynb b/samples/05_content_publishers/updating_features_in_a_feature_layer.ipynb index de21deee9d..ba2808f2d5 100644 --- a/samples/05_content_publishers/updating_features_in_a_feature_layer.ipynb +++ b/samples/05_content_publishers/updating_features_in_a_feature_layer.ipynb @@ -1 +1,2359 @@ -{"cells":[{"cell_type":"markdown","metadata":{},"source":["# Updating features in a feature layer\n","\n","As content publishers, you may be required to keep certain web layers up to date. As new data arrives, you may have to append new features, update existing features etc. There are a couple of different options to accomplish this:\n"," \n"," - Method 1: editing individual features as updated datasets are available\n"," - Method 2: overwriting feature layers altogether with updated datasets\n"," \n","Depending on the number of features that are updated, your workflow requirements, you may adopt either or both kinds of update mechanisms.\n","\n","In this sample, we explore the first method. For **Method 2**, refer to the sample titled [Overwriting feature layers](https://developers.arcgis.com/python/sample-notebooks/overwriting-feature-layers)\n","\n","**Note**: To run this sample, you need the ``pandas`` library in your conda environment. If you don't have the library, install it by running the following command from cmd.exe or your shell\n","```\n","conda install pandas```"]},{"cell_type":"markdown","metadata":{"toc":true},"source":["

Table of Contents

\n",""]},{"cell_type":"code","execution_count":3,"metadata":{},"outputs":[],"source":["# Connect to the GIS\n","from arcgis.gis import GIS\n","from arcgis import features\n","import pandas as pd"]},{"cell_type":"code","execution_count":4,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["Enter password: ········\n"]}],"source":["#Access the portal using \"amazing_arcgis_123\" as password for the given Username. \n","gis = GIS(\"https://pythonapi.playground.esri.com/portal\")"]},{"cell_type":"markdown","metadata":{},"source":["## Updating feature layer by editing individual features\n","\n","Let us consider a scenario where we need to update a feature layer containing the capital cities of the US. We have 3 csv datasets simulating an update workflow as described below:\n","\n"," 1. capitals_1.csv -- contains the initial, incomplete dataset\n"," 2. capitals_2.csv -- contains additional points and updates to existing points, building on top of capitals_1.csv\n"," 3. capitals_annex.csv -- an alternate table containing additional attribute information\n"," \n","Our goal is to update the feature layer with each of these datasets doing the necessary edit operations."]},{"cell_type":"markdown","metadata":{},"source":["### Publish the cities feature layer using the initial dataset"]},{"cell_type":"code","execution_count":5,"metadata":{},"outputs":[{"data":{"text/html":["
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
city_idnamestatecapitalpop2000pop2007longitudelatitude
01HonoluluHIState371657378587-157.82343621.305782
12JuneauAKState3071131592-134.51158258.351418
23Boise CityIDState185787203529-116.23765543.613736
34OlympiaWAState2751445523-122.89307347.042418
45SalemORState136924152039-123.02915544.931109
\n","
"],"text/plain":[" city_id name state capital pop2000 pop2007 longitude latitude\n","0 1 Honolulu HI State 371657 378587 -157.823436 21.305782\n","1 2 Juneau AK State 30711 31592 -134.511582 58.351418\n","2 3 Boise City ID State 185787 203529 -116.237655 43.613736\n","3 4 Olympia WA State 27514 45523 -122.893073 47.042418\n","4 5 Salem OR State 136924 152039 -123.029155 44.931109"]},"execution_count":5,"metadata":{},"output_type":"execute_result"}],"source":["# read the initial csv\n","csv1 = 'data/updating_gis_content/capitals_1.csv'\n","cities_df_1 = pd.read_csv(csv1)\n","cities_df_1.head()"]},{"cell_type":"code","execution_count":6,"metadata":{},"outputs":[{"data":{"text/plain":["(19, 8)"]},"execution_count":6,"metadata":{},"output_type":"execute_result"}],"source":["# print the number of records in this csv\n","cities_df_1.shape"]},{"cell_type":"markdown","metadata":{},"source":["As you can see, this dataset only contains 19 rows or 19 capital cities. It is not the complete dataset."]},{"cell_type":"markdown","metadata":{},"source":["Let's add this `csv` as a portal item. Adding the item creates a CSV item and uploads the original file to the portal, establishing a link between the item and the original file name. Therefore, we need a unique name for the file to guarantee it does not collide with any file of the same name that may have been uploaded by the same user. We'll use standard library modules to copy the file and give it a new name so we can add it to the portal"]},{"cell_type":"code","execution_count":7,"metadata":{},"outputs":[],"source":["import os\n","import datetime as dt\n","import shutil\n","\n","# assign variables to locations on the file system \n","cwd = os.path.abspath(os.getcwd())\n","data_pth = os.path.join(cwd, r'data/updating_gis_content/')\n","\n","# create a unique timestamp string to append to the file name\n","now_ts = str(int(dt.datetime.now().timestamp()))\n","\n","# copy the file, appending the unique string and assign it to a variable\n","my_csv = shutil.copyfile(os.path.abspath(csv1),\n"," os.path.join(data_pth, 'capitals_1_' + now_ts + '.csv'))"]},{"cell_type":"code","execution_count":8,"metadata":{},"outputs":[{"data":{"text/plain":["'\\\\path\\\\on\\\\computer\\\\05_content_publishers\\\\data/updating_gis_content/capitals_1_1548875507.csv'"]},"execution_count":8,"metadata":{},"output_type":"execute_result"}],"source":["my_csv"]},{"cell_type":"code","execution_count":9,"metadata":{},"outputs":[{"data":{"text/html":["
\n","
\n"," \n"," \n"," \n","
\n","\n","
\n"," USA Capitals spreadsheet 1548875507\n"," \n","
CSV by arcgis_python\n","
Last Modified: January 30, 2019\n","
0 comments, 0 views\n","
\n","
\n"," "],"text/plain":[""]},"execution_count":9,"metadata":{},"output_type":"execute_result"}],"source":["# add the initial csv file and publish that as a web layer\n","item_prop = {'title':'USA Capitals spreadsheet ' + now_ts, \"item_type\": \"CSV\"}\n","root_folder = gis.content.folders.get()\n","csv_item = root_folder.add(item_properties=item_prop, file=my_csv).result()\n","csv_item"]},{"cell_type":"markdown","metadata":{},"source":["This spreadsheet has co-ordinates as `latitude` and `longitude` columns which will be used for geometries during publishing."]},{"cell_type":"code","execution_count":10,"metadata":{},"outputs":[{"data":{"text/html":["
\n","
\n"," \n"," \n"," \n","
\n","\n","
\n"," USA Capitals spreadsheet 1548875507\n"," \n","
Feature Layer Collection by arcgis_python\n","
Last Modified: January 30, 2019\n","
0 comments, 0 views\n","
\n","
\n"," "],"text/plain":[""]},"execution_count":10,"metadata":{},"output_type":"execute_result"}],"source":["# publish the csv item into a feature layer\n","cities_item = csv_item.publish()\n","cities_item"]},{"cell_type":"code","execution_count":19,"metadata":{},"outputs":[{"data":{"text/html":["
\n","
\n"," \n"," \n"," \n","
\n","\n","
\n"," USA Capitals\n"," \n","
Feature Layer Collection by arcgis_python\n","
Last Modified: January 30, 2019\n","
0 comments, 0 views\n","
\n","
\n"," "],"text/plain":[""]},"execution_count":19,"metadata":{},"output_type":"execute_result"}],"source":["# update the item metadata\n","item_prop = {'title':'USA Capitals'}\n","cities_item.update(item_properties = item_prop, thumbnail='data/updating_gis_content/capital_cities.png')\n","cities_item"]},{"cell_type":"markdown","metadata":{},"source":["### Apply updates from the second spreadsheet\n","The next set of updates have arrived and are stored in `capitals_2.csv`. We are told it contains corrections for the original set in addition to new features. We need to figure out which rows have changed, apply `update` operation on those, then apply `add` operation to new rows.\n","\n","To start with, let us read the second csv file. Note, in this sample, data is stored in csv. In reality, it could be from your enterprise database or any other data source."]},{"cell_type":"code","execution_count":11,"metadata":{},"outputs":[{"data":{"text/html":["
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
city_idnamestatecapitalpop2000pop2007longitudelatitude
020Baton RougeLAState227818228810-91.14022730.458091
121HelenaMTState2578026007-112.02702746.595809
222BismarckNDState5553259344-100.77900046.813346
323PierreSDState1387614169-100.33638244.367964
424St. PaulMNState287151291643-93.11411844.954364
\n","
"],"text/plain":[" city_id name state capital pop2000 pop2007 longitude latitude\n","0 20 Baton Rouge LA State 227818 228810 -91.140227 30.458091\n","1 21 Helena MT State 25780 26007 -112.027027 46.595809\n","2 22 Bismarck ND State 55532 59344 -100.779000 46.813346\n","3 23 Pierre SD State 13876 14169 -100.336382 44.367964\n","4 24 St. Paul MN State 287151 291643 -93.114118 44.954364"]},"execution_count":11,"metadata":{},"output_type":"execute_result"}],"source":["# read the second csv set\n","csv2 = 'data/updating_gis_content/capitals_2.csv'\n","cities_df_2 = pd.read_csv(csv2)\n","cities_df_2.head()"]},{"cell_type":"code","execution_count":12,"metadata":{},"outputs":[{"data":{"text/plain":["(36, 8)"]},"execution_count":12,"metadata":{},"output_type":"execute_result"}],"source":["# get the dimensions of this csv\n","cities_df_2.shape"]},{"cell_type":"markdown","metadata":{},"source":["#### Identifying existing features that need to be updated\n","\n","To identify features that need to be updated, let us read the attribute table of the published feature layer and compare that against the second csv. To read the attribute table, we perform a `query()` on the feature layer which returns us an `arcgis.feature.FeatureSet` object. Refer to the guide pages on [accessing features from feature layers](https://developers.arcgis.com/python/guide/working-with-feature-layers-and-features/) to learn more about this.\n","\n","Note, at this point, we could work with the `cities_df_1` dataframe we created from the original csv file. However, in practice you may not always have the original dataset or your feature layer might have undergone edits after it was published. Hence, we query the feature layer directly."]},{"cell_type":"code","execution_count":13,"metadata":{},"outputs":[{"data":{"text/html":["
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
SHAPEcapitalcity_idlatitudelongitudenameobjectidpop2000pop2007state
0{\"x\": -17568824.553, \"y\": 2428377.352700006, \"...State121.305782-157.823436Honolulu1371657378587HI
1{\"x\": -14973760.769500002, \"y\": 8041504.674200...State258.351418-134.511582Juneau23071131592AK
2{\"x\": -12939516.521100001, \"y\": 5405860.248099...State343.613736-116.237655Boise City3185787203529ID
3{\"x\": -13680394.263900002, \"y\": 5949000.547900...State447.042418-122.893073Olympia42751445523WA
4{\"x\": -13695542.842799995, \"y\": 5610682.544100...State544.931109-123.029155Salem5136924152039OR
\n","
"],"text/plain":[" SHAPE capital city_id \\\n","0 {\"x\": -17568824.553, \"y\": 2428377.352700006, \"... State 1 \n","1 {\"x\": -14973760.769500002, \"y\": 8041504.674200... State 2 \n","2 {\"x\": -12939516.521100001, \"y\": 5405860.248099... State 3 \n","3 {\"x\": -13680394.263900002, \"y\": 5949000.547900... State 4 \n","4 {\"x\": -13695542.842799995, \"y\": 5610682.544100... State 5 \n","\n"," latitude longitude name objectid pop2000 pop2007 state \n","0 21.305782 -157.823436 Honolulu 1 371657 378587 HI \n","1 58.351418 -134.511582 Juneau 2 30711 31592 AK \n","2 43.613736 -116.237655 Boise City 3 185787 203529 ID \n","3 47.042418 -122.893073 Olympia 4 27514 45523 WA \n","4 44.931109 -123.029155 Salem 5 136924 152039 OR "]},"execution_count":13,"metadata":{},"output_type":"execute_result"}],"source":["cities_flayer = cities_item.layers[0]\n","cities_fset = cities_flayer.query() #querying without any conditions returns all the features\n","cities_fset.sdf.head()"]},{"cell_type":"markdown","metadata":{},"source":["The `city_id` column is common between both the datasets. Next, let us perform an `inner` join with the table from feature layer as left and updated csv as right. Inner joins will yield those rows that are present in both tables. Learn more about [inner joins here](https://www.w3schools.com/sql/sql_join_inner.asp)."]},{"cell_type":"code","execution_count":14,"metadata":{},"outputs":[{"data":{"text/html":["
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
SHAPEcapital_xcity_idlatitude_xlongitude_xname_xobjectidpop2000_xpop2007_xstate_xname_ystate_ycapital_ypop2000_ypop2007_ylongitude_ylatitude_y
0{'x': -13680394.263900002, 'y': 5949000.547900...State447.042418-122.893073Olympia42751445523WAOlympiaWAState4251445523-122.89307347.042418
1{'x': 11666510.350300007, 'y': -5033833.302499...State10-41.145545104.802046Cheyenne105301154750WYCheyenneWYState5301154750-104.80204641.145545
2{'x': -10857545.543799996, 'y': 4229619.674200...State1335.482309-97.534991Oklahoma City13506132552556OKKOklahoma CityOKState506132552556-97.53499135.482309
3{'x': -10421826.864700003, 'y': 5099899.263700...State1641.590936-93.620864Des Moines16200682201257IADes MoinesIAState198682201257-93.62086441.590936
\n","
"],"text/plain":[" SHAPE capital_x city_id \\\n","0 {'x': -13680394.263900002, 'y': 5949000.547900... State 4 \n","1 {'x': 11666510.350300007, 'y': -5033833.302499... State 10 \n","2 {'x': -10857545.543799996, 'y': 4229619.674200... State 13 \n","3 {'x': -10421826.864700003, 'y': 5099899.263700... State 16 \n","\n"," latitude_x longitude_x name_x objectid pop2000_x pop2007_x \\\n","0 47.042418 -122.893073 Olympia 4 27514 45523 \n","1 -41.145545 104.802046 Cheyenne 10 53011 54750 \n","2 35.482309 -97.534991 Oklahoma City 13 506132 552556 \n","3 41.590936 -93.620864 Des Moines 16 200682 201257 \n","\n"," state_x name_y state_y capital_y pop2000_y pop2007_y longitude_y \\\n","0 WA Olympia WA State 42514 45523 -122.893073 \n","1 WY Cheyenne WY State 53011 54750 -104.802046 \n","2 OKK Oklahoma City OK State 506132 552556 -97.534991 \n","3 IA Des Moines IA State 198682 201257 -93.620864 \n","\n"," latitude_y \n","0 47.042418 \n","1 41.145545 \n","2 35.482309 \n","3 41.590936 "]},"execution_count":14,"metadata":{},"output_type":"execute_result"}],"source":["overlap_rows = pd.merge(left = cities_fset.sdf, right = cities_df_2, how='inner',\n"," on = 'city_id')\n","overlap_rows"]},{"cell_type":"markdown","metadata":{},"source":["Thus, of 19 features in original and 36 features in second csv, 4 features are common. Inspecting the table, we find certain columns are updated, for instance, Cheyenne has its coordinates corrected, Oklahoma City has its state abbreviation corrected and similarly other cities have one of their attribute columns updated.\n","\n","We could either update individual attribute values for these 4 features or update all attribute values with the latest csv. Below, we are performing the latter as it is simple and fast.\n","\n","#### Perform updates to the existing features"]},{"cell_type":"code","execution_count":15,"metadata":{},"outputs":[],"source":["features_for_update = [] #list containing corrected features\n","all_features = cities_fset.features"]},{"cell_type":"code","execution_count":16,"metadata":{},"outputs":[{"data":{"text/plain":["{\"geometry\": {\"x\": -17568824.553, \"y\": 2428377.352700006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 1, \"city_id\": 1, \"name\": \"Honolulu\", \"state\": \"HI\", \"capital\": \"State\", \"pop2000\": 371657, \"pop2007\": 378587, \"longitude\": -157.8234362, \"latitude\": 21.30578163, \"SHAPE\": {\"x\": -17568824.553, \"y\": 2428377.352700006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}"]},"execution_count":16,"metadata":{},"output_type":"execute_result"}],"source":["# inspect one of the features\n","all_features[0]"]},{"cell_type":"markdown","metadata":{},"source":["Note the X and Y geometry values are different from decimal degree coordinates present in Longitude and Latitude fields. To perform geometry edits, we need to project the coordinates to match that of the feature layer."]},{"cell_type":"code","execution_count":17,"metadata":{},"outputs":[{"data":{"text/plain":["{'wkid': 102100, 'latestWkid': 3857}"]},"execution_count":17,"metadata":{},"output_type":"execute_result"}],"source":["# get the spatial reference of the features since we need to update the geometry\n","cities_fset.spatial_reference"]},{"cell_type":"markdown","metadata":{},"source":["Below, we prepare updated geometries and attributes for each of the 4 features we determined above. We use the `arcgis.geometry` module to `project` the coordinates from geographic to projected coordinate system. 
The cell below prints the original `Feature` objects followed by the updated ones. If you look closely, you can find the differences."]},{"cell_type":"code","execution_count":18,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["{\"geometry\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 4, \"city_id\": 4, \"name\": \"Olympia\", \"state\": \"WA\", \"capital\": \"State\", \"pop2000\": 27514, \"pop2007\": 45523, \"longitude\": -122.8930726, \"latitude\": 47.04241817, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","{\"geometry\": {\"x\": -13680394.2638528, \"y\": 5949000.54792491}, \"attributes\": {\"objectid\": 4, \"city_id\": 4, \"name\": \"Olympia\", \"state\": \"WA\", \"capital\": \"State\", \"pop2000\": 42514, \"pop2007\": 45523, \"longitude\": -122.8930726, \"latitude\": 47.04241817, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","========================================================================\n","{\"geometry\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 10, \"city_id\": 10, \"name\": \"Cheyenne\", \"state\": \"WY\", \"capital\": \"State\", \"pop2000\": 53011, \"pop2007\": 54750, \"longitude\": 104.8020456, \"latitude\": -41.14554516, \"SHAPE\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","{\"geometry\": {\"x\": -11666510.350285435, \"y\": 5033833.302497153}, \"attributes\": {\"objectid\": 10, \"city_id\": 10, \"name\": \"Cheyenne\", \"state\": \"WY\", \"capital\": \"State\", \"pop2000\": 53011, \"pop2007\": 54750, \"longitude\": -104.80204559999999, \"latitude\": 41.14554516, \"SHAPE\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","========================================================================\n","{\"geometry\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 13, \"city_id\": 13, \"name\": \"Oklahoma City\", \"state\": \"OKK\", \"capital\": \"State\", \"pop2000\": 506132, \"pop2007\": 552556, \"longitude\": -97.5349911, \"latitude\": 35.48230867, \"SHAPE\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","{\"geometry\": {\"x\": -10857545.54377847, \"y\": 4229619.674165817}, \"attributes\": {\"objectid\": 13, \"city_id\": 13, \"name\": \"Oklahoma City\", \"state\": \"OK\", \"capital\": \"State\", \"pop2000\": 506132, \"pop2007\": 552556, \"longitude\": -97.5349911, \"latitude\": 35.48230867, \"SHAPE\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","========================================================================\n","{\"geometry\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 16, \"city_id\": 16, \"name\": \"Des Moines\", \"state\": \"IA\", \"capital\": \"State\", \"pop2000\": 200682, \"pop2007\": 201257, \"longitude\": -93.62086361, 
\"latitude\": 41.59093617, \"SHAPE\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","{\"geometry\": {\"x\": -10421826.864691716, \"y\": 5099899.263692743}, \"attributes\": {\"objectid\": 16, \"city_id\": 16, \"name\": \"Des Moines\", \"state\": \"IA\", \"capital\": \"State\", \"pop2000\": 198682, \"pop2007\": 201257, \"longitude\": -93.62086361, \"latitude\": 41.59093617, \"SHAPE\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n","========================================================================\n"]}],"source":["from arcgis import geometry #use geometry module to project Long,Lat to X and Y\n","from copy import deepcopy\n","\n","for city_id in overlap_rows['city_id']:\n"," # get the feature to be updated\n"," original_feature = [f for f in all_features if f.attributes['city_id'] == city_id][0]\n"," feature_to_be_updated = deepcopy(original_feature)\n"," \n"," print(str(original_feature))\n"," \n"," # get the matching row from csv\n"," matching_row = cities_df_2.where(cities_df_2.city_id == city_id).dropna()\n"," \n"," #get geometries in the destination coordinate system\n"," input_geometry = {'y':float(matching_row['latitude']),\n"," 'x':float(matching_row['longitude'])}\n"," output_geometry = geometry.project(geometries = [input_geometry],\n"," in_sr = 4326, \n"," out_sr = cities_fset.spatial_reference['latestWkid'],\n"," gis = gis)\n"," \n"," # assign the updated values\n"," feature_to_be_updated.geometry = output_geometry[0]\n"," feature_to_be_updated.attributes['longitude'] = float(matching_row['longitude'])\n"," feature_to_be_updated.attributes['city_id'] = int(matching_row['city_id'])\n"," feature_to_be_updated.attributes['state'] = matching_row['state'].values[0]\n"," feature_to_be_updated.attributes['capital'] = matching_row['capital'].values[0]\n"," feature_to_be_updated.attributes['latitude'] = float(matching_row['latitude'])\n"," feature_to_be_updated.attributes['name'] = matching_row['name'].values[0]\n"," feature_to_be_updated.attributes['pop2000'] = int(matching_row['pop2000'])\n"," feature_to_be_updated.attributes['pop2007'] = int(matching_row['pop2007'])\n"," \n"," #add this to the list of features to be updated\n"," features_for_update.append(feature_to_be_updated)\n"," \n"," print(str(feature_to_be_updated))\n"," print(\"========================================================================\")"]},{"cell_type":"markdown","metadata":{},"source":["We have constructed a list of features with updated values. 
We can use this list to perform updates on the feature layer."]},{"cell_type":"code","execution_count":19,"metadata":{},"outputs":[{"data":{"text/plain":["[{\"geometry\": {\"x\": -13680394.2638528, \"y\": 5949000.54792491}, \"attributes\": {\"objectid\": 4, \"city_id\": 4, \"name\": \"Olympia\", \"state\": \"WA\", \"capital\": \"State\", \"pop2000\": 42514, \"pop2007\": 45523, \"longitude\": -122.8930726, \"latitude\": 47.04241817, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}},\n"," {\"geometry\": {\"x\": -11666510.350285435, \"y\": 5033833.302497153}, \"attributes\": {\"objectid\": 10, \"city_id\": 10, \"name\": \"Cheyenne\", \"state\": \"WY\", \"capital\": \"State\", \"pop2000\": 53011, \"pop2007\": 54750, \"longitude\": -104.80204559999999, \"latitude\": 41.14554516, \"SHAPE\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}},\n"," {\"geometry\": {\"x\": -10857545.54377847, \"y\": 4229619.674165817}, \"attributes\": {\"objectid\": 13, \"city_id\": 13, \"name\": \"Oklahoma City\", \"state\": \"OK\", \"capital\": \"State\", \"pop2000\": 506132, \"pop2007\": 552556, \"longitude\": -97.5349911, \"latitude\": 35.48230867, \"SHAPE\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}},\n"," {\"geometry\": {\"x\": -10421826.864691716, \"y\": 5099899.263692743}, \"attributes\": {\"objectid\": 16, \"city_id\": 16, \"name\": \"Des Moines\", \"state\": \"IA\", \"capital\": \"State\", \"pop2000\": 198682, \"pop2007\": 201257, \"longitude\": -93.62086361, \"latitude\": 41.59093617, \"SHAPE\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}]"]},"execution_count":19,"metadata":{},"output_type":"execute_result"}],"source":["features_for_update"]},{"cell_type":"markdown","metadata":{},"source":["To update the feature layer, call the `edit_features()` method of the `FeatureLayer` object and pass the list of features to the `updates` parameter:"]},{"cell_type":"code","execution_count":20,"metadata":{},"outputs":[{"data":{"text/plain":["{'addResults': [],\n"," 'updateResults': [{'objectId': 4, 'globalId': None, 'success': True},\n"," {'objectId': 10, 'globalId': None, 'success': True},\n"," {'objectId': 13, 'globalId': None, 'success': True},\n"," {'objectId': 16, 'globalId': None, 'success': True}],\n"," 'deleteResults': [],\n"," 'attachments': {'addResults': [], 'updateResults': [], 'deleteResults': []}}"]},"execution_count":20,"metadata":{},"output_type":"execute_result"}],"source":["cities_flayer.edit_features(updates= features_for_update)"]},{"cell_type":"markdown","metadata":{},"source":["We have successfully applied corrections to those features which existed in the feature layer from the initial dataset. Next let us proceed to adding new features present only in the second csv file.\n","\n","#### Identifying new features that need to be added"]},{"cell_type":"code","execution_count":21,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["(32, 8)\n"]}],"source":["#select those rows in the capitals_2.csv that do not overlap with those in capitals_1.csv\n","new_rows = cities_df_2[~cities_df_2['city_id'].isin(overlap_rows['city_id'])]\n","print(new_rows.shape)"]},{"cell_type":"code","execution_count":22,"metadata":{},"outputs":[{"data":{"text/html":["
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
city_idnamestatecapitalpop2000pop2007longitudelatitude
020Baton RougeLAState227818228810-91.14022730.458091
121HelenaMTState2578026007-112.02702746.595809
222BismarckNDState5553259344-100.77900046.813346
323PierreSDState1387614169-100.33638244.367964
424St. PaulMNState287151291643-93.11411844.954364
\n","
"],"text/plain":[" city_id name state capital pop2000 pop2007 longitude latitude\n","0 20 Baton Rouge LA State 227818 228810 -91.140227 30.458091\n","1 21 Helena MT State 25780 26007 -112.027027 46.595809\n","2 22 Bismarck ND State 55532 59344 -100.779000 46.813346\n","3 23 Pierre SD State 13876 14169 -100.336382 44.367964\n","4 24 St. Paul MN State 287151 291643 -93.114118 44.954364"]},"execution_count":22,"metadata":{},"output_type":"execute_result"}],"source":["new_rows.head()"]},{"cell_type":"markdown","metadata":{},"source":["Thus, of the total 36 rows in the second csv, we have determined the 32 other rows which are new and need to be appended as new features."]},{"cell_type":"markdown","metadata":{},"source":["#### Adding new features\n","Next, let us compose another `list` of `Feature` objects similar to earlier, from the `new_rows` data frame."]},{"cell_type":"code","execution_count":23,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["Creating Baton Rouge\n","Creating Helena\n","Creating Bismarck\n","Creating Pierre\n","Creating St. Paul\n","Creating Madison\n","Creating Lansing\n","Creating Augusta\n","Creating Montpelier\n","Creating Albany\n","Creating Boston\n","Creating Concord\n","Creating Tallahassee\n","Creating Jackson\n","Creating Nashville\n","Creating Montgomery\n","Creating Springfield\n","Creating Indianapolis\n","Creating Frankfort\n","Creating Columbus\n","Creating Charleston\n","Creating Atlanta\n","Creating Columbia\n","Creating Raleigh\n","Creating Richmond\n","Creating Harrisburg\n","Creating Trenton\n","Creating Dover\n","Creating Washington\n","Creating Annapolis\n","Creating Hartford\n","Creating Providence\n"]}],"source":["features_to_be_added = []\n","\n","# get a template feature object\n","template_feature = deepcopy(features_for_update[0])\n","\n","# loop through each row and add to the list of features to be added\n","for row in new_rows.iterrows():\n"," new_feature = deepcopy(template_feature)\n"," \n"," #print\n"," print(\"Creating \" + row[1]['name'])\n"," \n"," #get geometries in the destination coordinate system\n"," input_geometry = {'y':float(row[1]['latitude']),\n"," 'x':float(row[1]['longitude'])}\n"," output_geometry = geometry.project(geometries = [input_geometry],\n"," in_sr = 4326, \n"," out_sr = cities_fset.spatial_reference['latestWkid'],\n"," gis = gis)\n"," \n"," # assign the updated values\n"," new_feature.geometry = output_geometry[0]\n"," new_feature.attributes['longitude'] = float(row[1]['longitude'])\n"," new_feature.attributes['city_id'] = int(row[1]['city_id'])\n"," new_feature.attributes['state'] = row[1]['state']\n"," new_feature.attributes['capital'] = row[1]['capital']\n"," new_feature.attributes['latitude'] = float(row[1]['latitude'])\n"," new_feature.attributes['name'] = row[1]['name']\n"," new_feature.attributes['pop2000'] = int(row[1]['pop2000'])\n"," new_feature.attributes['pop2007'] = int(row[1]['pop2007'])\n"," \n"," #add this to the list of features to be updated\n"," features_to_be_added.append(new_feature)"]},{"cell_type":"code","execution_count":24,"metadata":{},"outputs":[{"data":{"text/plain":["{\"geometry\": {\"x\": -10145683.671555312, \"y\": 3562570.141302621}, \"attributes\": {\"objectid\": 4, \"city_id\": 20, \"name\": \"Baton Rouge\", \"state\": \"LA\", \"capital\": \"State\", \"pop2000\": 227818, \"pop2007\": 228810, \"longitude\": -91.14022709999999, \"latitude\": 30.45809113, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 
102100, \"latestWkid\": 3857}}}}"]},"execution_count":24,"metadata":{},"output_type":"execute_result"}],"source":["# take a look at one of the features we created\n","features_to_be_added[0]"]},{"cell_type":"markdown","metadata":{},"source":["Thus, we have created a `list` of `Feature` objects with appropriate attributes and geometries. Next, to add these new features to the feature layer, call the `edit_features()` method of the `FeatureLayer` object and pass the list of `Feature` objects to the `adds` parameter:"]},{"cell_type":"code","execution_count":25,"metadata":{},"outputs":[{"data":{"text/plain":["{'addResults': [{'objectId': 20, 'globalId': None, 'success': True},\n"," {'objectId': 21, 'globalId': None, 'success': True},\n"," {'objectId': 22, 'globalId': None, 'success': True},\n"," {'objectId': 23, 'globalId': None, 'success': True},\n"," {'objectId': 24, 'globalId': None, 'success': True},\n"," {'objectId': 25, 'globalId': None, 'success': True},\n"," {'objectId': 26, 'globalId': None, 'success': True},\n"," {'objectId': 27, 'globalId': None, 'success': True},\n"," {'objectId': 28, 'globalId': None, 'success': True},\n"," {'objectId': 29, 'globalId': None, 'success': True},\n"," {'objectId': 30, 'globalId': None, 'success': True},\n"," {'objectId': 31, 'globalId': None, 'success': True},\n"," {'objectId': 32, 'globalId': None, 'success': True},\n"," {'objectId': 33, 'globalId': None, 'success': True},\n"," {'objectId': 34, 'globalId': None, 'success': True},\n"," {'objectId': 35, 'globalId': None, 'success': True},\n"," {'objectId': 36, 'globalId': None, 'success': True},\n"," {'objectId': 37, 'globalId': None, 'success': True},\n"," {'objectId': 38, 'globalId': None, 'success': True},\n"," {'objectId': 39, 'globalId': None, 'success': True},\n"," {'objectId': 40, 'globalId': None, 'success': True},\n"," {'objectId': 41, 'globalId': None, 'success': True},\n"," {'objectId': 42, 'globalId': None, 'success': True},\n"," {'objectId': 43, 'globalId': None, 'success': True},\n"," {'objectId': 44, 'globalId': None, 'success': True},\n"," {'objectId': 45, 'globalId': None, 'success': True},\n"," {'objectId': 46, 'globalId': None, 'success': True},\n"," {'objectId': 47, 'globalId': None, 'success': True},\n"," {'objectId': 48, 'globalId': None, 'success': True},\n"," {'objectId': 49, 'globalId': None, 'success': True},\n"," {'objectId': 50, 'globalId': None, 'success': True},\n"," {'objectId': 51, 'globalId': None, 'success': True}],\n"," 'updateResults': [],\n"," 'deleteResults': [],\n"," 'attachments': {'addResults': [], 'updateResults': [], 'deleteResults': []}}"]},"execution_count":25,"metadata":{},"output_type":"execute_result"}],"source":["cities_flayer.edit_features(adds = features_to_be_added)"]},{"cell_type":"markdown","metadata":{},"source":["Thus, we have successfully applied edits from second csv file. Next let us look at how we can apply edits from third csv file.\n","\n","### Apply edits from third spreadsheet\n","The next set of updates have arrived and are stored in `capitals_annex.csv`. We are told it contains additional columns for each of the features that we want to add to the feature layer.\n","\n","To start with, let us read the third csv file. Note in this sample, data is stored in csv. In reality, it could be from your enterprise database or any other data source."]},{"cell_type":"code","execution_count":26,"metadata":{},"outputs":[{"data":{"text/html":["
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
city_idnameclasswhiteblackameri_esasianhawn_plhispanicmalesfemales
01HonoluluCensus Designated Place7309360386892075882545716229182628189029
12JuneauCity and Borough229692483496143811610401546915242
23Boise CityCity17120414371300387030284109201493773
34OlympiaCity36246805553247312518632031922195
45SalemCity113746175020643304643199736875268172
\n","
"],"text/plain":[" city_id name class white black ameri_es \\\n","0 1 Honolulu Census Designated Place 73093 6038 689 \n","1 2 Juneau City and Borough 22969 248 3496 \n","2 3 Boise City City 171204 1437 1300 \n","3 4 Olympia City 36246 805 553 \n","4 5 Salem City 113746 1750 2064 \n","\n"," asian hawn_pl hispanic males females \n","0 207588 25457 16229 182628 189029 \n","1 1438 116 1040 15469 15242 \n","2 3870 302 8410 92014 93773 \n","3 2473 125 1863 20319 22195 \n","4 3304 643 19973 68752 68172 "]},"execution_count":26,"metadata":{},"output_type":"execute_result"}],"source":["# read the third csv set\n","csv3 = 'data/updating_gis_content/capitals_annex.csv'\n","cities_df_3 = pd.read_csv(csv3)\n","cities_df_3.head()"]},{"cell_type":"code","execution_count":27,"metadata":{},"outputs":[{"data":{"text/plain":["(51, 11)"]},"execution_count":27,"metadata":{},"output_type":"execute_result"}],"source":["#find the number of rows in the third csv\n","cities_df_3.shape"]},{"cell_type":"markdown","metadata":{},"source":["The `capitals_annex.csv` does not add new features, instead it adds additional attribute columns to existing features. It has 51 rows which were found to match the 19 + 32 rows from first and second csv files. The columns `City_ID` and `NAME` are common to all 3 spreadsheets. Next let us take a look at how we can append this additional attribute information to our feature layer.\n","\n","#### Inspecting existing fields of the feature layer\n","The `manager` property of the `FeatureLayer` object exposes a set of methods to read and update the properties and definition of feature layers."]},{"cell_type":"code","execution_count":28,"metadata":{},"outputs":[{"data":{"text/plain":["{\n"," \"name\": \"city_id\",\n"," \"type\": \"esriFieldTypeInteger\",\n"," \"alias\": \"city_id\",\n"," \"domain\": null,\n"," \"editable\": true,\n"," \"nullable\": true,\n"," \"sqlType\": \"sqlTypeInteger\",\n"," \"length\": 10\n","}"]},"execution_count":28,"metadata":{},"output_type":"execute_result"}],"source":["#Get the existing list of fields on the cities feature layer\n","cities_fields = cities_flayer.manager.properties.fields\n","\n","# Your feature layer may have multiple fields, \n","# instead of printing all, let us take a look at one of the fields:\n","cities_fields[1]"]},{"cell_type":"markdown","metadata":{},"source":["From above, we can see the representation of one of the fields. Let us loop through each of the fields and print the `name`, `alias`, `type` and `sqlType` properties"]},{"cell_type":"code","execution_count":29,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["objectid | objectid | esriFieldTypeOID | sqlTypeInteger\n","city_id | city_id | esriFieldTypeInteger | sqlTypeInteger\n","name | name | esriFieldTypeString | sqlTypeVarchar\n","state | state | esriFieldTypeString | sqlTypeVarchar\n","capital | capital | esriFieldTypeString | sqlTypeVarchar\n","pop2000 | pop2000 | esriFieldTypeInteger | sqlTypeInteger\n","pop2007 | pop2007 | esriFieldTypeInteger | sqlTypeInteger\n","longitude | longitude | esriFieldTypeDouble | sqlTypeNumeric\n","latitude | latitude | esriFieldTypeDouble | sqlTypeNumeric\n"]}],"source":["for field in cities_fields:\n"," print(f\"{field.name:13}| {field.alias:13}| {field.type:25}| {field.sqlType}\")"]},{"cell_type":"markdown","metadata":{},"source":["#### Preparing additional columns to add to the feature layer\n","Now that we have an idea of how the fields are defined, we can go ahead and append new fields to the layer's definition. 
Once we compose a list of new fields, by calling the `add_to_definition()` method we can push those changes to the feature layer. Once the feature layer's definition is updated with new fields, we can loop through each feature and add the appropriate attribute values.\n","\n","To compose a list of new fields to be added, we start by making a copy of one of the fields as a template and start editing it. One easy part in this example is, all new fields that need to be added except one, are of the same data type: integer. With your data, this may not be the case. In such instances, you can add each field individually."]},{"cell_type":"code","execution_count":30,"metadata":{},"outputs":[{"data":{"text/plain":["{'name': 'city_id',\n"," 'type': 'esriFieldTypeInteger',\n"," 'alias': 'city_id',\n"," 'domain': None,\n"," 'editable': True,\n"," 'nullable': True,\n"," 'sqlType': 'sqlTypeInteger',\n"," 'length': 10}"]},"execution_count":30,"metadata":{},"output_type":"execute_result"}],"source":["# get a template field\n","template_field = dict(deepcopy(cities_fields[1]))\n","template_field"]},{"cell_type":"markdown","metadata":{},"source":["Let us use pandas to get the list of fields that are **new** in spread sheet 3"]},{"cell_type":"code","execution_count":31,"metadata":{},"outputs":[{"data":{"text/plain":["['ameri_es',\n"," 'asian',\n"," 'black',\n"," 'class',\n"," 'females',\n"," 'hawn_pl',\n"," 'hispanic',\n"," 'males',\n"," 'white']"]},"execution_count":31,"metadata":{},"output_type":"execute_result"}],"source":["# get the list of new fields to add from the third spreadsheet, that are not in spread sheets 1,2\n","new_field_names = list(cities_df_3.columns.difference(cities_df_1.columns))\n","new_field_names"]},{"cell_type":"markdown","metadata":{},"source":["Now loop though each new field name and create a field dictionary using the template we created earlier. 
Except the field titled `class` all other fields are of type `integer`."]},{"cell_type":"code","execution_count":32,"metadata":{},"outputs":[{"data":{"text/plain":["9"]},"execution_count":32,"metadata":{},"output_type":"execute_result"}],"source":["fields_to_be_added = []\n","for new_field_name in new_field_names:\n"," current_field = deepcopy(template_field)\n"," if new_field_name.lower() == 'class':\n"," current_field['sqlType'] = 'sqlTypeVarchar'\n"," current_field['type'] = 'esriFieldTypeString'\n"," current_field['length'] = 8000\n"," current_field['name'] = new_field_name.lower()\n"," current_field['alias'] = new_field_name\n"," fields_to_be_added.append(current_field)\n"," \n","len(fields_to_be_added)"]},{"cell_type":"code","execution_count":33,"metadata":{},"outputs":[{"data":{"text/plain":["{'name': 'class',\n"," 'type': 'esriFieldTypeString',\n"," 'alias': 'class',\n"," 'domain': None,\n"," 'editable': True,\n"," 'nullable': True,\n"," 'sqlType': 'sqlTypeVarchar',\n"," 'length': 8000}"]},"execution_count":33,"metadata":{},"output_type":"execute_result"}],"source":["#inspect one of the fields\n","fields_to_be_added[3]"]},{"cell_type":"markdown","metadata":{},"source":["#### Adding additional fields to the feature layer\n","The list of new fields we composed can be pushed to the server by calling `add_to_definition()` method on the `manager` property."]},{"cell_type":"code","execution_count":34,"metadata":{},"outputs":[{"data":{"text/plain":["{'success': True}"]},"execution_count":34,"metadata":{},"output_type":"execute_result"}],"source":["cities_flayer.manager.add_to_definition({'fields':fields_to_be_added})"]},{"cell_type":"markdown","metadata":{},"source":["Thus, we have successfully added new fields to our feature layer. Let us verify the new columns show up:"]},{"cell_type":"code","execution_count":35,"metadata":{},"outputs":[{"data":{"text/plain":["18"]},"execution_count":35,"metadata":{},"output_type":"execute_result"}],"source":["new_cities_fields = cities_flayer.manager.properties.fields\n","len(new_cities_fields)"]},{"cell_type":"code","execution_count":36,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["objectid | esriFieldTypeOID\n","city_id | esriFieldTypeInteger\n","name | esriFieldTypeString\n","state | esriFieldTypeString\n","capital | esriFieldTypeString\n","pop2000 | esriFieldTypeInteger\n","pop2007 | esriFieldTypeInteger\n","longitude | esriFieldTypeDouble\n","latitude | esriFieldTypeDouble\n","ameri_es | esriFieldTypeInteger\n","asian | esriFieldTypeInteger\n","black | esriFieldTypeInteger\n","class | esriFieldTypeString\n","females | esriFieldTypeInteger\n","hawn_pl | esriFieldTypeInteger\n","hispanic | esriFieldTypeInteger\n","males | esriFieldTypeInteger\n","white | esriFieldTypeInteger\n"]}],"source":["for field in new_cities_fields:\n"," print(f\"{field.name:10}| {field.type}\")"]},{"cell_type":"markdown","metadata":{},"source":["#### Adding attribute values to the new columns\n","Next we can loop through each row in the third csv and add the new attribute values for these newly created columns."]},{"cell_type":"code","execution_count":37,"metadata":{},"outputs":[],"source":["# Run a fresh query on the feature layer so it includes the new features from\n","# csv2 and new columns from csv3\n","cities_fset2 = cities_flayer.query()\n","cities_features2 = cities_fset2.features"]},{"cell_type":"markdown","metadata":{},"source":["Loop through each row in the third spreadsheet, find the corresponding feature by matching the `city_id` 
value and apply the attribute values for the new fields."]},{"cell_type":"code","execution_count":38,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["1 Adding additional attributes for: Honolulu\n","2 Adding additional attributes for: Juneau\n","3 Adding additional attributes for: Boise City\n","4 Adding additional attributes for: Olympia\n","5 Adding additional attributes for: Salem\n","6 Adding additional attributes for: Carson\n","7 Adding additional attributes for: Sacramento\n","8 Adding additional attributes for: Phoenix\n","9 Adding additional attributes for: Salt Lake City\n","10 Adding additional attributes for: Cheyenne\n","11 Adding additional attributes for: Denver\n","12 Adding additional attributes for: Santa Fe\n","13 Adding additional attributes for: Oklahoma City\n","14 Adding additional attributes for: Topeka\n","15 Adding additional attributes for: Lincoln\n","16 Adding additional attributes for: Des Moines\n","17 Adding additional attributes for: Jefferson City\n","18 Adding additional attributes for: Little Rock\n","19 Adding additional attributes for: Austin\n","20 Adding additional attributes for: Baton Rouge\n","21 Adding additional attributes for: Helena\n","22 Adding additional attributes for: Bismarck\n","23 Adding additional attributes for: Pierre\n","24 Adding additional attributes for: St. Paul\n","25 Adding additional attributes for: Madison\n","26 Adding additional attributes for: Lansing\n","27 Adding additional attributes for: Augusta\n","28 Adding additional attributes for: Montpelier\n","29 Adding additional attributes for: Albany\n","30 Adding additional attributes for: Boston\n","31 Adding additional attributes for: Concord\n","32 Adding additional attributes for: Tallahassee\n","33 Adding additional attributes for: Jackson\n","34 Adding additional attributes for: Nashville\n","35 Adding additional attributes for: Montgomery\n","36 Adding additional attributes for: Springfield\n","37 Adding additional attributes for: Indianapolis\n","38 Adding additional attributes for: Frankfort\n","39 Adding additional attributes for: Columbus\n","40 Adding additional attributes for: Charleston\n","41 Adding additional attributes for: Atlanta\n","42 Adding additional attributes for: Columbia\n","43 Adding additional attributes for: Raleigh\n","44 Adding additional attributes for: Richmond\n","45 Adding additional attributes for: Harrisburg\n","46 Adding additional attributes for: Trenton\n","47 Adding additional attributes for: Dover\n","48 Adding additional attributes for: Washington\n","49 Adding additional attributes for: Annapolis\n","50 Adding additional attributes for: Hartford\n","51 Adding additional attributes for: Providence\n"]}],"source":["features_for_update = []\n","for city_id in cities_df_3['city_id']:\n"," # get the matching row from csv\n"," matching_row = cities_df_3.where(cities_df_3.city_id == city_id).dropna()\n"," \n"," print(str(city_id) + \" Adding additional attributes for: \" + matching_row['name'].values[0])\n"," \n"," # get the feature to be updated\n"," original_feature = [f for f in cities_features2 if f.attributes['city_id'] == city_id][0]\n"," feature_to_be_updated = deepcopy(original_feature)\n"," \n"," # assign the updated values\n"," feature_to_be_updated.attributes['class'] = matching_row['class'].values[0]\n"," feature_to_be_updated.attributes['white'] = int(matching_row['white'])\n"," feature_to_be_updated.attributes['black'] = int(matching_row['black'])\n"," feature_to_be_updated.attributes['ameri_es'] 
= int(matching_row['ameri_es'])\n"," feature_to_be_updated.attributes['asian'] = int(matching_row['asian'])\n"," feature_to_be_updated.attributes['hawn_pl'] = int(matching_row['hawn_pl'])\n"," feature_to_be_updated.attributes['hispanic'] = int(matching_row['hispanic'])\n"," feature_to_be_updated.attributes['males'] = int(matching_row['males'])\n"," feature_to_be_updated.attributes['females'] = int(matching_row['females'])\n"," \n"," #add this to the list of features to be updated\n"," features_for_update.append(feature_to_be_updated)"]},{"cell_type":"code","execution_count":39,"metadata":{},"outputs":[{"data":{"text/plain":["{\"geometry\": {\"x\": -7950674.8190312665, \"y\": 5134585.3226086125}, \"attributes\": {\"objectid\": 51, \"city_id\": 51, \"name\": \"Providence\", \"state\": \"RI\", \"capital\": \"State\", \"pop2000\": 173618, \"pop2007\": 183731, \"longitude\": -71.42212709, \"latitude\": 41.82355466, \"ameri_es\": 1975, \"asian\": 10432, \"black\": 25243, \"class\": \"City\", \"females\": 90583, \"hawn_pl\": 270, \"hispanic\": 52146, \"males\": 83035, \"white\": 94666}}"]},"execution_count":39,"metadata":{},"output_type":"execute_result"}],"source":["# inspect one of the features\n","features_for_update[-1]"]},{"cell_type":"code","execution_count":40,"metadata":{},"outputs":[{"data":{"text/plain":["{'addResults': [],\n"," 'updateResults': [{'objectId': 1, 'globalId': None, 'success': True},\n"," {'objectId': 2, 'globalId': None, 'success': True},\n"," {'objectId': 3, 'globalId': None, 'success': True},\n"," {'objectId': 4, 'globalId': None, 'success': True},\n"," {'objectId': 5, 'globalId': None, 'success': True},\n"," {'objectId': 6, 'globalId': None, 'success': True},\n"," {'objectId': 7, 'globalId': None, 'success': True},\n"," {'objectId': 8, 'globalId': None, 'success': True},\n"," {'objectId': 9, 'globalId': None, 'success': True},\n"," {'objectId': 10, 'globalId': None, 'success': True},\n"," {'objectId': 11, 'globalId': None, 'success': True},\n"," {'objectId': 12, 'globalId': None, 'success': True},\n"," {'objectId': 13, 'globalId': None, 'success': True},\n"," {'objectId': 14, 'globalId': None, 'success': True},\n"," {'objectId': 15, 'globalId': None, 'success': True},\n"," {'objectId': 16, 'globalId': None, 'success': True},\n"," {'objectId': 17, 'globalId': None, 'success': True},\n"," {'objectId': 18, 'globalId': None, 'success': True},\n"," {'objectId': 19, 'globalId': None, 'success': True},\n"," {'objectId': 20, 'globalId': None, 'success': True},\n"," {'objectId': 21, 'globalId': None, 'success': True},\n"," {'objectId': 22, 'globalId': None, 'success': True},\n"," {'objectId': 23, 'globalId': None, 'success': True},\n"," {'objectId': 24, 'globalId': None, 'success': True},\n"," {'objectId': 25, 'globalId': None, 'success': True},\n"," {'objectId': 26, 'globalId': None, 'success': True},\n"," {'objectId': 27, 'globalId': None, 'success': True},\n"," {'objectId': 28, 'globalId': None, 'success': True},\n"," {'objectId': 29, 'globalId': None, 'success': True},\n"," {'objectId': 30, 'globalId': None, 'success': True},\n"," {'objectId': 31, 'globalId': None, 'success': True},\n"," {'objectId': 32, 'globalId': None, 'success': True},\n"," {'objectId': 33, 'globalId': None, 'success': True},\n"," {'objectId': 34, 'globalId': None, 'success': True},\n"," {'objectId': 35, 'globalId': None, 'success': True},\n"," {'objectId': 36, 'globalId': None, 'success': True},\n"," {'objectId': 37, 'globalId': None, 'success': True},\n"," {'objectId': 38, 'globalId': None, 'success': 
True},\n"," {'objectId': 39, 'globalId': None, 'success': True},\n"," {'objectId': 40, 'globalId': None, 'success': True},\n"," {'objectId': 41, 'globalId': None, 'success': True},\n"," {'objectId': 42, 'globalId': None, 'success': True},\n"," {'objectId': 43, 'globalId': None, 'success': True},\n"," {'objectId': 44, 'globalId': None, 'success': True},\n"," {'objectId': 45, 'globalId': None, 'success': True},\n"," {'objectId': 46, 'globalId': None, 'success': True},\n"," {'objectId': 47, 'globalId': None, 'success': True},\n"," {'objectId': 48, 'globalId': None, 'success': True},\n"," {'objectId': 49, 'globalId': None, 'success': True},\n"," {'objectId': 50, 'globalId': None, 'success': True},\n"," {'objectId': 51, 'globalId': None, 'success': True}],\n"," 'deleteResults': [],\n"," 'attachments': {'addResults': [], 'updateResults': [], 'deleteResults': []}}"]},"execution_count":40,"metadata":{},"output_type":"execute_result"}],"source":["# apply the edits to the feature layer\n","cities_flayer.edit_features(updates= features_for_update)"]},{"cell_type":"markdown","metadata":{},"source":["#### Verify the changes made so far\n","Let us run another query on the feature layer and visualize a few rows."]},{"cell_type":"code","execution_count":41,"metadata":{},"outputs":[{"data":{"text/html":["
\n","\n","\n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n"," \n","
SHAPEameri_esasianblackcapitalcity_idclassfemaleshawn_plhispaniclatitudelongitudemalesnameobjectidpop2000pop2007statewhite
0{\"x\": -17568824.553, \"y\": 2428377.352700006, \"...6892075886038State1Census Designated Place189029254571622921.305782-157.823436182628Honolulu1371657378587HI73093
1{\"x\": -14973760.769500002, \"y\": 8041504.674200...34961438248State2City and Borough15242116104058.351418-134.51158215469Juneau23071131592AK22969
2{\"x\": -12939516.521100001, \"y\": 5405860.248099...130038701437State3City93773302841043.613736-116.23765592014Boise City3185787203529ID171204
3{\"x\": -13680394.263852797, \"y\": 5949000.547924...5532473805State4City22195125186347.042418-122.89307320319Olympia44251445523WA36246
4{\"x\": -13695542.842799995, \"y\": 5610682.544100...206433041750State5City681726431997344.931109-123.02915568752Salem5136924152039OR113746
\n","
"],"text/plain":[" SHAPE ameri_es asian black \\\n","0 {\"x\": -17568824.553, \"y\": 2428377.352700006, \"... 689 207588 6038 \n","1 {\"x\": -14973760.769500002, \"y\": 8041504.674200... 3496 1438 248 \n","2 {\"x\": -12939516.521100001, \"y\": 5405860.248099... 1300 3870 1437 \n","3 {\"x\": -13680394.263852797, \"y\": 5949000.547924... 553 2473 805 \n","4 {\"x\": -13695542.842799995, \"y\": 5610682.544100... 2064 3304 1750 \n","\n"," capital city_id class females hawn_pl hispanic \\\n","0 State 1 Census Designated Place 189029 25457 16229 \n","1 State 2 City and Borough 15242 116 1040 \n","2 State 3 City 93773 302 8410 \n","3 State 4 City 22195 125 1863 \n","4 State 5 City 68172 643 19973 \n","\n"," latitude longitude males name objectid pop2000 pop2007 \\\n","0 21.305782 -157.823436 182628 Honolulu 1 371657 378587 \n","1 58.351418 -134.511582 15469 Juneau 2 30711 31592 \n","2 43.613736 -116.237655 92014 Boise City 3 185787 203529 \n","3 47.042418 -122.893073 20319 Olympia 4 42514 45523 \n","4 44.931109 -123.029155 68752 Salem 5 136924 152039 \n","\n"," state white \n","0 HI 73093 \n","1 AK 22969 \n","2 ID 171204 \n","3 WA 36246 \n","4 OR 113746 "]},"execution_count":41,"metadata":{},"output_type":"execute_result"}],"source":["cities_fset3 = cities_flayer.query()\n","cities_fset3.sdf.head(5)"]},{"cell_type":"markdown","metadata":{},"source":["## Conclusion\n","In this sample, we observed an edit intensive method to keep feature layers updated. We published data from first spreadsheet as a feature layer. We then updated existing features from second spread sheet (used geometry module to project the coordinates in the process), and added new features. The third spreadsheet presented additional attribute columns which were added to the feature layer by editing its definition and then updating the features with this additional data.\n","\n","This method is editing intensive and you may choose this when the number of features to edit is less or if you needed to selectively update certain features as updates come in.\n","\n","An alternate method is to overwrite the feature layer altogether when you always have current information coming in. This method is explained in the sample [Overwriting feature layers](https://developers.arcgis.com/python/sample-notebooks/overwriting-feature-layers)"]}],"metadata":{"anaconda-cloud":{},"esriNotebookRuntime":{"notebookRuntimeName":"ArcGIS Notebook Python 3 Standard","notebookRuntimeVersion":"4.0"},"kernelspec":{"display_name":"Python 3","language":"python","name":"python3"},"language_info":{"codemirror_mode":{"name":"ipython","version":3},"file_extension":".py","mimetype":"text/x-python","name":"python","nbconvert_exporter":"python","pygments_lexer":"ipython3","version":"3.8.2"},"toc":{"base_numbering":1,"nav_menu":{},"number_sections":true,"sideBar":true,"skip_h1_title":false,"title_cell":"Table of Contents","title_sidebar":"Contents","toc_cell":true,"toc_position":{},"toc_section_display":true,"toc_window_display":true}},"nbformat":4,"nbformat_minor":1} +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Updating features in a feature layer\n", + "\n", + "As content publishers, you may be required to keep certain web layers up to date. As new data arrives, you may have to append new features, update existing features etc. 
There are a couple of different options to accomplish this:\n",
+ " \n",
+ " - Method 1: editing individual features as updated datasets are available\n",
+ " - Method 2: overwriting feature layers altogether with updated datasets\n",
+ " \n",
+ "Depending on the number of features to be updated and on your workflow requirements, you may adopt either or both of these update mechanisms.\n",
+ "\n",
+ "In this sample, we explore the first method. For **Method 2**, refer to the sample titled [Overwriting feature layers](https://developers.arcgis.com/python/sample-notebooks/overwriting-feature-layers).\n",
+ "\n",
+ "**Note**: To run this sample, you need the ``pandas`` library in your conda environment. If you don't have the library, install it by running the following command from cmd.exe or your shell:\n",
+ "```\n",
+ "conda install pandas\n",
+ "```"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "toc": true
+ },
+ "source": [
+ "

Table of Contents

\n", + "" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [], + "source": [ + "# Connect to the GIS\n", + "from arcgis.gis import GIS\n", + "from arcgis import features\n", + "import pandas as pd" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Enter password: ········\n" + ] + } + ], + "source": [ + "#Access the portal using \"amazing_arcgis_123\" as password for the given Username. \n", + "gis = GIS(\"https://pythonapi.playground.esri.com/portal\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Updating feature layer by editing individual features\n", + "\n", + "Let us consider a scenario where we need to update a feature layer containing the capital cities of the US. We have 3 csv datasets simulating an update workflow as described below:\n", + "\n", + " 1. capitals_1.csv -- contains the initial, incomplete dataset\n", + " 2. capitals_2.csv -- contains additional points and updates to existing points, building on top of capitals_1.csv\n", + " 3. capitals_annex.csv -- an alternate table containing additional attribute information\n", + " \n", + "Our goal is to update the feature layer with each of these datasets doing the necessary edit operations." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Publish the cities feature layer using the initial dataset" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
city_idnamestatecapitalpop2000pop2007longitudelatitude
01HonoluluHIState371657378587-157.82343621.305782
12JuneauAKState3071131592-134.51158258.351418
23Boise CityIDState185787203529-116.23765543.613736
34OlympiaWAState2751445523-122.89307347.042418
45SalemORState136924152039-123.02915544.931109
\n", + "
" + ], + "text/plain": [ + " city_id name state capital pop2000 pop2007 longitude latitude\n", + "0 1 Honolulu HI State 371657 378587 -157.823436 21.305782\n", + "1 2 Juneau AK State 30711 31592 -134.511582 58.351418\n", + "2 3 Boise City ID State 185787 203529 -116.237655 43.613736\n", + "3 4 Olympia WA State 27514 45523 -122.893073 47.042418\n", + "4 5 Salem OR State 136924 152039 -123.029155 44.931109" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# read the initial csv\n", + "csv1 = 'data/updating_gis_content/capitals_1.csv'\n", + "cities_df_1 = pd.read_csv(csv1)\n", + "cities_df_1.head()" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "(19, 8)" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# print the number of records in this csv\n", + "cities_df_1.shape" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can see, this dataset only contains 19 rows or 19 capital cities. It is not the complete dataset." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's add this `csv` as a portal item. Adding the item creates a CSV item and uploads the original file to the portal, establishing a link between the item and the original file name. Therefore, we need a unique name for the file to guarantee it does not collide with any file of the same name that may have been uploaded by the same user. We'll use standard library modules to copy the file and give it a new name so we can add it to the portal" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [], + "source": [ + "import os\n", + "import datetime as dt\n", + "import shutil\n", + "\n", + "# assign variables to locations on the file system \n", + "cwd = os.path.abspath(os.getcwd())\n", + "data_pth = os.path.join(cwd, r'data/updating_gis_content/')\n", + "\n", + "# create a unique timestamp string to append to the file name\n", + "now_ts = str(int(dt.datetime.now().timestamp()))\n", + "\n", + "# copy the file, appending the unique string and assign it to a variable\n", + "my_csv = shutil.copyfile(os.path.abspath(csv1),\n", + " os.path.join(data_pth, 'capitals_1_' + now_ts + '.csv'))" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "'\\\\path\\\\on\\\\computer\\\\05_content_publishers\\\\data/updating_gis_content/capitals_1_1548875507.csv'" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "my_csv" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "
\n", + " \n", + " \n", + " \n", + "
\n", + "\n", + "
\n", + " USA Capitals spreadsheet 1548875507\n", + " \n", + "
CSV by arcgis_python\n", + "
Last Modified: January 30, 2019\n", + "
0 comments, 0 views\n", + "
\n", + "
\n", + " " + ], + "text/plain": [ + "" + ] + }, + "execution_count": 9, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# add the initial csv file and publish that as a web layer\n", + "item_prop = {'title':'USA Capitals spreadsheet ' + now_ts, \"type\": \"CSV\"}\n", + "root_folder = gis.content.folders.get()\n", + "csv_item = root_folder.add(item_properties=item_prop, file=my_csv).result()\n", + "csv_item" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This spreadsheet has co-ordinates as `latitude` and `longitude` columns which will be used for geometries during publishing." + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "
\n", + " \n", + " \n", + " \n", + "
\n", + "\n", + "
\n", + " USA Capitals spreadsheet 1548875507\n", + " \n", + "
Feature Layer Collection by arcgis_python\n", + "
Last Modified: January 30, 2019\n", + "
0 comments, 0 views\n", + "
\n", + "
\n", + " " + ], + "text/plain": [ + "" + ] + }, + "execution_count": 10, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# publish the csv item into a feature layer\n", + "cities_item = csv_item.publish()\n", + "cities_item" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "
\n", + " \n", + " \n", + " \n", + "
\n", + "\n", + "
\n", + " USA Capitals\n", + " \n", + "
Feature Layer Collection by arcgis_python\n", + "
Last Modified: January 30, 2019\n", + "
0 comments, 0 views\n", + "
\n", + "
\n", + " " + ], + "text/plain": [ + "" + ] + }, + "execution_count": 19, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# update the item metadata\n", + "item_prop = {'title':'USA Capitals'}\n", + "cities_item.update(item_properties = item_prop, thumbnail='data/updating_gis_content/capital_cities.png')\n", + "cities_item" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Apply updates from the second spreadsheet\n", + "The next set of updates have arrived and are stored in `capitals_2.csv`. We are told it contains corrections for the original set in addition to new features. We need to figure out which rows have changed, apply `update` operation on those, then apply `add` operation to new rows.\n", + "\n", + "To start with, let us read the second csv file. Note, in this sample, data is stored in csv. In reality, it could be from your enterprise database or any other data source." + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
city_idnamestatecapitalpop2000pop2007longitudelatitude
020Baton RougeLAState227818228810-91.14022730.458091
121HelenaMTState2578026007-112.02702746.595809
222BismarckNDState5553259344-100.77900046.813346
323PierreSDState1387614169-100.33638244.367964
424St. PaulMNState287151291643-93.11411844.954364
\n", + "
" + ], + "text/plain": [ + " city_id name state capital pop2000 pop2007 longitude latitude\n", + "0 20 Baton Rouge LA State 227818 228810 -91.140227 30.458091\n", + "1 21 Helena MT State 25780 26007 -112.027027 46.595809\n", + "2 22 Bismarck ND State 55532 59344 -100.779000 46.813346\n", + "3 23 Pierre SD State 13876 14169 -100.336382 44.367964\n", + "4 24 St. Paul MN State 287151 291643 -93.114118 44.954364" + ] + }, + "execution_count": 11, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# read the second csv set\n", + "csv2 = 'data/updating_gis_content/capitals_2.csv'\n", + "cities_df_2 = pd.read_csv(csv2)\n", + "cities_df_2.head()" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "(36, 8)" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# get the dimensions of this csv\n", + "cities_df_2.shape" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Identifying existing features that need to be updated\n", + "\n", + "To identify features that need to be updated, let us read the attribute table of the published feature layer and compare that against the second csv. To read the attribute table, we perform a `query()` on the feature layer which returns us an `arcgis.feature.FeatureSet` object. Refer to the guide pages on [accessing features from feature layers](https://developers.arcgis.com/python/guide/working-with-feature-layers-and-features/) to learn more about this.\n", + "\n", + "Note, at this point, we could work with the `cities_df_1` dataframe we created from the original csv file. However, in practice you may not always have the original dataset or your feature layer might have undergone edits after it was published. Hence, we query the feature layer directly." + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
SHAPEcapitalcity_idlatitudelongitudenameobjectidpop2000pop2007state
0{\"x\": -17568824.553, \"y\": 2428377.352700006, \"...State121.305782-157.823436Honolulu1371657378587HI
1{\"x\": -14973760.769500002, \"y\": 8041504.674200...State258.351418-134.511582Juneau23071131592AK
2{\"x\": -12939516.521100001, \"y\": 5405860.248099...State343.613736-116.237655Boise City3185787203529ID
3{\"x\": -13680394.263900002, \"y\": 5949000.547900...State447.042418-122.893073Olympia42751445523WA
4{\"x\": -13695542.842799995, \"y\": 5610682.544100...State544.931109-123.029155Salem5136924152039OR
\n", + "
" + ], + "text/plain": [ + " SHAPE capital city_id \\\n", + "0 {\"x\": -17568824.553, \"y\": 2428377.352700006, \"... State 1 \n", + "1 {\"x\": -14973760.769500002, \"y\": 8041504.674200... State 2 \n", + "2 {\"x\": -12939516.521100001, \"y\": 5405860.248099... State 3 \n", + "3 {\"x\": -13680394.263900002, \"y\": 5949000.547900... State 4 \n", + "4 {\"x\": -13695542.842799995, \"y\": 5610682.544100... State 5 \n", + "\n", + " latitude longitude name objectid pop2000 pop2007 state \n", + "0 21.305782 -157.823436 Honolulu 1 371657 378587 HI \n", + "1 58.351418 -134.511582 Juneau 2 30711 31592 AK \n", + "2 43.613736 -116.237655 Boise City 3 185787 203529 ID \n", + "3 47.042418 -122.893073 Olympia 4 27514 45523 WA \n", + "4 44.931109 -123.029155 Salem 5 136924 152039 OR " + ] + }, + "execution_count": 13, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "cities_flayer = cities_item.layers[0]\n", + "cities_fset = cities_flayer.query() #querying without any conditions returns all the features\n", + "cities_fset.sdf.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The `city_id` column is common between both the datasets. Next, let us perform an `inner` join with the table from feature layer as left and updated csv as right. Inner joins will yield those rows that are present in both tables. Learn more about [inner joins here](https://www.w3schools.com/sql/sql_join_inner.asp)." + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
SHAPEcapital_xcity_idlatitude_xlongitude_xname_xobjectidpop2000_xpop2007_xstate_xname_ystate_ycapital_ypop2000_ypop2007_ylongitude_ylatitude_y
0{'x': -13680394.263900002, 'y': 5949000.547900...State447.042418-122.893073Olympia42751445523WAOlympiaWAState4251445523-122.89307347.042418
1{'x': 11666510.350300007, 'y': -5033833.302499...State10-41.145545104.802046Cheyenne105301154750WYCheyenneWYState5301154750-104.80204641.145545
2{'x': -10857545.543799996, 'y': 4229619.674200...State1335.482309-97.534991Oklahoma City13506132552556OKKOklahoma CityOKState506132552556-97.53499135.482309
3{'x': -10421826.864700003, 'y': 5099899.263700...State1641.590936-93.620864Des Moines16200682201257IADes MoinesIAState198682201257-93.62086441.590936
\n", + "
" + ], + "text/plain": [ + " SHAPE capital_x city_id \\\n", + "0 {'x': -13680394.263900002, 'y': 5949000.547900... State 4 \n", + "1 {'x': 11666510.350300007, 'y': -5033833.302499... State 10 \n", + "2 {'x': -10857545.543799996, 'y': 4229619.674200... State 13 \n", + "3 {'x': -10421826.864700003, 'y': 5099899.263700... State 16 \n", + "\n", + " latitude_x longitude_x name_x objectid pop2000_x pop2007_x \\\n", + "0 47.042418 -122.893073 Olympia 4 27514 45523 \n", + "1 -41.145545 104.802046 Cheyenne 10 53011 54750 \n", + "2 35.482309 -97.534991 Oklahoma City 13 506132 552556 \n", + "3 41.590936 -93.620864 Des Moines 16 200682 201257 \n", + "\n", + " state_x name_y state_y capital_y pop2000_y pop2007_y longitude_y \\\n", + "0 WA Olympia WA State 42514 45523 -122.893073 \n", + "1 WY Cheyenne WY State 53011 54750 -104.802046 \n", + "2 OKK Oklahoma City OK State 506132 552556 -97.534991 \n", + "3 IA Des Moines IA State 198682 201257 -93.620864 \n", + "\n", + " latitude_y \n", + "0 47.042418 \n", + "1 41.145545 \n", + "2 35.482309 \n", + "3 41.590936 " + ] + }, + "execution_count": 14, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "overlap_rows = pd.merge(left = cities_fset.sdf, right = cities_df_2, how='inner',\n", + " on = 'city_id')\n", + "overlap_rows" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Thus, of 19 features in original and 36 features in second csv, 4 features are common. Inspecting the table, we find certain columns are updated, for instance, Cheyenne has its coordinates corrected, Oklahoma City has its state abbreviation corrected and similarly other cities have one of their attribute columns updated.\n", + "\n", + "We could either update individual attribute values for these 4 features or update all attribute values with the latest csv. Below, we are performing the latter as it is simple and fast.\n", + "\n", + "#### Perform updates to the existing features" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": {}, + "outputs": [], + "source": [ + "features_for_update = [] #list containing corrected features\n", + "all_features = cities_fset.features" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{\"geometry\": {\"x\": -17568824.553, \"y\": 2428377.352700006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 1, \"city_id\": 1, \"name\": \"Honolulu\", \"state\": \"HI\", \"capital\": \"State\", \"pop2000\": 371657, \"pop2007\": 378587, \"longitude\": -157.8234362, \"latitude\": 21.30578163, \"SHAPE\": {\"x\": -17568824.553, \"y\": 2428377.352700006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}" + ] + }, + "execution_count": 16, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# inspect one of the features\n", + "all_features[0]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note the X and Y geometry values are different from decimal degree coordinates present in Longitude and Latitude fields. To perform geometry edits, we need to project the coordinates to match that of the feature layer." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{'wkid': 102100, 'latestWkid': 3857}" + ] + }, + "execution_count": 17, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# get the spatial reference of the features since we need to update the geometry\n", + "cities_fset.spatial_reference" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Below, we prepare updated geometries and attributes for each of the 4 features we determined above. We use the `arcgis.geometry` module to `project` the coordinates from geographic to projected coordinate system. The cell below prints the original `Feature` objects followed by the updated ones. If you look closely, you can find the differences." + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{\"geometry\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 4, \"city_id\": 4, \"name\": \"Olympia\", \"state\": \"WA\", \"capital\": \"State\", \"pop2000\": 27514, \"pop2007\": 45523, \"longitude\": -122.8930726, \"latitude\": 47.04241817, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n", + "{\"geometry\": {\"x\": -13680394.2638528, \"y\": 5949000.54792491}, \"attributes\": {\"objectid\": 4, \"city_id\": 4, \"name\": \"Olympia\", \"state\": \"WA\", \"capital\": \"State\", \"pop2000\": 42514, \"pop2007\": 45523, \"longitude\": -122.8930726, \"latitude\": 47.04241817, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n", + "========================================================================\n", + "{\"geometry\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 10, \"city_id\": 10, \"name\": \"Cheyenne\", \"state\": \"WY\", \"capital\": \"State\", \"pop2000\": 53011, \"pop2007\": 54750, \"longitude\": 104.8020456, \"latitude\": -41.14554516, \"SHAPE\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n", + "{\"geometry\": {\"x\": -11666510.350285435, \"y\": 5033833.302497153}, \"attributes\": {\"objectid\": 10, \"city_id\": 10, \"name\": \"Cheyenne\", \"state\": \"WY\", \"capital\": \"State\", \"pop2000\": 53011, \"pop2007\": 54750, \"longitude\": -104.80204559999999, \"latitude\": 41.14554516, \"SHAPE\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n", + "========================================================================\n", + "{\"geometry\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 13, \"city_id\": 13, \"name\": \"Oklahoma City\", \"state\": \"OKK\", \"capital\": \"State\", \"pop2000\": 506132, \"pop2007\": 552556, \"longitude\": -97.5349911, \"latitude\": 35.48230867, \"SHAPE\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n", + "{\"geometry\": {\"x\": -10857545.54377847, \"y\": 4229619.674165817}, \"attributes\": 
{\"objectid\": 13, \"city_id\": 13, \"name\": \"Oklahoma City\", \"state\": \"OK\", \"capital\": \"State\", \"pop2000\": 506132, \"pop2007\": 552556, \"longitude\": -97.5349911, \"latitude\": 35.48230867, \"SHAPE\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n", + "========================================================================\n", + "{\"geometry\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}, \"attributes\": {\"objectid\": 16, \"city_id\": 16, \"name\": \"Des Moines\", \"state\": \"IA\", \"capital\": \"State\", \"pop2000\": 200682, \"pop2007\": 201257, \"longitude\": -93.62086361, \"latitude\": 41.59093617, \"SHAPE\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n", + "{\"geometry\": {\"x\": -10421826.864691716, \"y\": 5099899.263692743}, \"attributes\": {\"objectid\": 16, \"city_id\": 16, \"name\": \"Des Moines\", \"state\": \"IA\", \"capital\": \"State\", \"pop2000\": 198682, \"pop2007\": 201257, \"longitude\": -93.62086361, \"latitude\": 41.59093617, \"SHAPE\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}\n", + "========================================================================\n" + ] + } + ], + "source": [ + "from arcgis import geometry #use geometry module to project Long,Lat to X and Y\n", + "from copy import deepcopy\n", + "\n", + "for city_id in overlap_rows['city_id']:\n", + " # get the feature to be updated\n", + " original_feature = [f for f in all_features if f.attributes['city_id'] == city_id][0]\n", + " feature_to_be_updated = deepcopy(original_feature)\n", + " \n", + " print(str(original_feature))\n", + " \n", + " # get the matching row from csv\n", + " matching_row = cities_df_2.where(cities_df_2.city_id == city_id).dropna()\n", + " \n", + " #get geometries in the destination coordinate system\n", + " input_geometry = {'y':float(matching_row['latitude']),\n", + " 'x':float(matching_row['longitude'])}\n", + " output_geometry = geometry.project(geometries = [input_geometry],\n", + " in_sr = 4326, \n", + " out_sr = cities_fset.spatial_reference['latestWkid'],\n", + " gis = gis)\n", + " \n", + " # assign the updated values\n", + " feature_to_be_updated.geometry = output_geometry[0]\n", + " feature_to_be_updated.attributes['longitude'] = float(matching_row['longitude'])\n", + " feature_to_be_updated.attributes['city_id'] = int(matching_row['city_id'])\n", + " feature_to_be_updated.attributes['state'] = matching_row['state'].values[0]\n", + " feature_to_be_updated.attributes['capital'] = matching_row['capital'].values[0]\n", + " feature_to_be_updated.attributes['latitude'] = float(matching_row['latitude'])\n", + " feature_to_be_updated.attributes['name'] = matching_row['name'].values[0]\n", + " feature_to_be_updated.attributes['pop2000'] = int(matching_row['pop2000'])\n", + " feature_to_be_updated.attributes['pop2007'] = int(matching_row['pop2007'])\n", + " \n", + " #add this to the list of features to be updated\n", + " features_for_update.append(feature_to_be_updated)\n", + " \n", + " print(str(feature_to_be_updated))\n", + " print(\"========================================================================\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have constructed a list of features with updated values. 
We can use this list to perform updates on the feature layer." + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[{\"geometry\": {\"x\": -13680394.2638528, \"y\": 5949000.54792491}, \"attributes\": {\"objectid\": 4, \"city_id\": 4, \"name\": \"Olympia\", \"state\": \"WA\", \"capital\": \"State\", \"pop2000\": 42514, \"pop2007\": 45523, \"longitude\": -122.8930726, \"latitude\": 47.04241817, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}},\n", + " {\"geometry\": {\"x\": -11666510.350285435, \"y\": 5033833.302497153}, \"attributes\": {\"objectid\": 10, \"city_id\": 10, \"name\": \"Cheyenne\", \"state\": \"WY\", \"capital\": \"State\", \"pop2000\": 53011, \"pop2007\": 54750, \"longitude\": -104.80204559999999, \"latitude\": 41.14554516, \"SHAPE\": {\"x\": 11666510.350300007, \"y\": -5033833.302499998, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}},\n", + " {\"geometry\": {\"x\": -10857545.54377847, \"y\": 4229619.674165817}, \"attributes\": {\"objectid\": 13, \"city_id\": 13, \"name\": \"Oklahoma City\", \"state\": \"OK\", \"capital\": \"State\", \"pop2000\": 506132, \"pop2007\": 552556, \"longitude\": -97.5349911, \"latitude\": 35.48230867, \"SHAPE\": {\"x\": -10857545.543799996, \"y\": 4229619.674200006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}},\n", + " {\"geometry\": {\"x\": -10421826.864691716, \"y\": 5099899.263692743}, \"attributes\": {\"objectid\": 16, \"city_id\": 16, \"name\": \"Des Moines\", \"state\": \"IA\", \"capital\": \"State\", \"pop2000\": 198682, \"pop2007\": 201257, \"longitude\": -93.62086361, \"latitude\": 41.59093617, \"SHAPE\": {\"x\": -10421826.864700003, \"y\": 5099899.263700008, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}]" + ] + }, + "execution_count": 19, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "features_for_update" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To update the feature layer, call the `edit_features()` method of the `FeatureLayer` object and pass the list of features to the `updates` parameter:" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{'addResults': [],\n", + " 'updateResults': [{'objectId': 4, 'globalId': None, 'success': True},\n", + " {'objectId': 10, 'globalId': None, 'success': True},\n", + " {'objectId': 13, 'globalId': None, 'success': True},\n", + " {'objectId': 16, 'globalId': None, 'success': True}],\n", + " 'deleteResults': [],\n", + " 'attachments': {'addResults': [], 'updateResults': [], 'deleteResults': []}}" + ] + }, + "execution_count": 20, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "cities_flayer.edit_features(updates= features_for_update)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have successfully applied corrections to those features which existed in the feature layer from the initial dataset. 
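\n",
+ "\n",
+ "In practice, it is worth confirming that every edit succeeded before moving on. A small sketch (re-running the same call as above and capturing its response; re-applying identical values is harmless here):\n",
+ "\n",
+ "```python\n",
+ "# every entry in updateResults should report success\n",
+ "result = cities_flayer.edit_features(updates=features_for_update)\n",
+ "failed = [res for res in result['updateResults'] if not res['success']]\n",
+ "print(len(failed), 'update(s) failed')\n",
+ "```\n",
+ "\n",
+ "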
Next, let us proceed to add the new features present only in the second csv file.\n",
+ "\n",
+ "#### Identifying new features that need to be added"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 21,
+ "metadata": {},
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "(32, 8)\n"
+ ]
+ }
+ ],
+ "source": [
+ "# select those rows in capitals_2.csv that do not overlap with those in capitals_1.csv;\n",
+ "# the ~ negates isin(), keeping only city_ids absent from overlap_rows\n",
+ "new_rows = cities_df_2[~cities_df_2['city_id'].isin(overlap_rows['city_id'])]\n",
+ "print(new_rows.shape)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 22,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/html": [
+ "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
city_idnamestatecapitalpop2000pop2007longitudelatitude
020Baton RougeLAState227818228810-91.14022730.458091
121HelenaMTState2578026007-112.02702746.595809
222BismarckNDState5553259344-100.77900046.813346
323PierreSDState1387614169-100.33638244.367964
424St. PaulMNState287151291643-93.11411844.954364
\n", + "
" + ], + "text/plain": [ + " city_id name state capital pop2000 pop2007 longitude latitude\n", + "0 20 Baton Rouge LA State 227818 228810 -91.140227 30.458091\n", + "1 21 Helena MT State 25780 26007 -112.027027 46.595809\n", + "2 22 Bismarck ND State 55532 59344 -100.779000 46.813346\n", + "3 23 Pierre SD State 13876 14169 -100.336382 44.367964\n", + "4 24 St. Paul MN State 287151 291643 -93.114118 44.954364" + ] + }, + "execution_count": 22, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "new_rows.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Thus, of the total 36 rows in the second csv, we have determined the 32 other rows which are new and need to be appended as new features." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Adding new features\n", + "Next, let us compose another `list` of `Feature` objects similar to earlier, from the `new_rows` data frame." + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Creating Baton Rouge\n", + "Creating Helena\n", + "Creating Bismarck\n", + "Creating Pierre\n", + "Creating St. Paul\n", + "Creating Madison\n", + "Creating Lansing\n", + "Creating Augusta\n", + "Creating Montpelier\n", + "Creating Albany\n", + "Creating Boston\n", + "Creating Concord\n", + "Creating Tallahassee\n", + "Creating Jackson\n", + "Creating Nashville\n", + "Creating Montgomery\n", + "Creating Springfield\n", + "Creating Indianapolis\n", + "Creating Frankfort\n", + "Creating Columbus\n", + "Creating Charleston\n", + "Creating Atlanta\n", + "Creating Columbia\n", + "Creating Raleigh\n", + "Creating Richmond\n", + "Creating Harrisburg\n", + "Creating Trenton\n", + "Creating Dover\n", + "Creating Washington\n", + "Creating Annapolis\n", + "Creating Hartford\n", + "Creating Providence\n" + ] + } + ], + "source": [ + "features_to_be_added = []\n", + "\n", + "# get a template feature object\n", + "template_feature = deepcopy(features_for_update[0])\n", + "\n", + "# loop through each row and add to the list of features to be added\n", + "for row in new_rows.iterrows():\n", + " new_feature = deepcopy(template_feature)\n", + " \n", + " #print\n", + " print(\"Creating \" + row[1]['name'])\n", + " \n", + " #get geometries in the destination coordinate system\n", + " input_geometry = {'y':float(row[1]['latitude']),\n", + " 'x':float(row[1]['longitude'])}\n", + " output_geometry = geometry.project(geometries = [input_geometry],\n", + " in_sr = 4326, \n", + " out_sr = cities_fset.spatial_reference['latestWkid'],\n", + " gis = gis)\n", + " \n", + " # assign the updated values\n", + " new_feature.geometry = output_geometry[0]\n", + " new_feature.attributes['longitude'] = float(row[1]['longitude'])\n", + " new_feature.attributes['city_id'] = int(row[1]['city_id'])\n", + " new_feature.attributes['state'] = row[1]['state']\n", + " new_feature.attributes['capital'] = row[1]['capital']\n", + " new_feature.attributes['latitude'] = float(row[1]['latitude'])\n", + " new_feature.attributes['name'] = row[1]['name']\n", + " new_feature.attributes['pop2000'] = int(row[1]['pop2000'])\n", + " new_feature.attributes['pop2007'] = int(row[1]['pop2007'])\n", + " \n", + " #add this to the list of features to be updated\n", + " features_to_be_added.append(new_feature)" + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + 
"{\"geometry\": {\"x\": -10145683.671555312, \"y\": 3562570.141302621}, \"attributes\": {\"objectid\": 4, \"city_id\": 20, \"name\": \"Baton Rouge\", \"state\": \"LA\", \"capital\": \"State\", \"pop2000\": 227818, \"pop2007\": 228810, \"longitude\": -91.14022709999999, \"latitude\": 30.45809113, \"SHAPE\": {\"x\": -13680394.263900002, \"y\": 5949000.547900006, \"spatialReference\": {\"wkid\": 102100, \"latestWkid\": 3857}}}}" + ] + }, + "execution_count": 24, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# take a look at one of the features we created\n", + "features_to_be_added[0]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Thus, we have created a `list` of `Feature` objects with appropriate attributes and geometries. Next, to add these new features to the feature layer, call the `edit_features()` method of the `FeatureLayer` object and pass the list of `Feature` objects to the `adds` parameter:" + ] + }, + { + "cell_type": "code", + "execution_count": 25, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{'addResults': [{'objectId': 20, 'globalId': None, 'success': True},\n", + " {'objectId': 21, 'globalId': None, 'success': True},\n", + " {'objectId': 22, 'globalId': None, 'success': True},\n", + " {'objectId': 23, 'globalId': None, 'success': True},\n", + " {'objectId': 24, 'globalId': None, 'success': True},\n", + " {'objectId': 25, 'globalId': None, 'success': True},\n", + " {'objectId': 26, 'globalId': None, 'success': True},\n", + " {'objectId': 27, 'globalId': None, 'success': True},\n", + " {'objectId': 28, 'globalId': None, 'success': True},\n", + " {'objectId': 29, 'globalId': None, 'success': True},\n", + " {'objectId': 30, 'globalId': None, 'success': True},\n", + " {'objectId': 31, 'globalId': None, 'success': True},\n", + " {'objectId': 32, 'globalId': None, 'success': True},\n", + " {'objectId': 33, 'globalId': None, 'success': True},\n", + " {'objectId': 34, 'globalId': None, 'success': True},\n", + " {'objectId': 35, 'globalId': None, 'success': True},\n", + " {'objectId': 36, 'globalId': None, 'success': True},\n", + " {'objectId': 37, 'globalId': None, 'success': True},\n", + " {'objectId': 38, 'globalId': None, 'success': True},\n", + " {'objectId': 39, 'globalId': None, 'success': True},\n", + " {'objectId': 40, 'globalId': None, 'success': True},\n", + " {'objectId': 41, 'globalId': None, 'success': True},\n", + " {'objectId': 42, 'globalId': None, 'success': True},\n", + " {'objectId': 43, 'globalId': None, 'success': True},\n", + " {'objectId': 44, 'globalId': None, 'success': True},\n", + " {'objectId': 45, 'globalId': None, 'success': True},\n", + " {'objectId': 46, 'globalId': None, 'success': True},\n", + " {'objectId': 47, 'globalId': None, 'success': True},\n", + " {'objectId': 48, 'globalId': None, 'success': True},\n", + " {'objectId': 49, 'globalId': None, 'success': True},\n", + " {'objectId': 50, 'globalId': None, 'success': True},\n", + " {'objectId': 51, 'globalId': None, 'success': True}],\n", + " 'updateResults': [],\n", + " 'deleteResults': [],\n", + " 'attachments': {'addResults': [], 'updateResults': [], 'deleteResults': []}}" + ] + }, + "execution_count": 25, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "cities_flayer.edit_features(adds = features_to_be_added)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Thus, we have successfully applied edits from second csv file. 
Next, let us look at how we can apply edits from the third csv file.\n",
+ "\n",
+ "### Apply edits from third spreadsheet\n",
+ "The next set of updates has arrived and is stored in `capitals_annex.csv`. We are told it contains additional attribute columns for each of the features, which we want to add to the feature layer.\n",
+ "\n",
+ "To start with, let us read the third csv file. Note: in this sample, the data is stored in csv files. In reality, it could come from your enterprise database or any other data source."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 26,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/html": [
+ "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
city_idnameclasswhiteblackameri_esasianhawn_plhispanicmalesfemales
01HonoluluCensus Designated Place7309360386892075882545716229182628189029
12JuneauCity and Borough229692483496143811610401546915242
23Boise CityCity17120414371300387030284109201493773
34OlympiaCity36246805553247312518632031922195
45SalemCity113746175020643304643199736875268172
\n", + "
" + ], + "text/plain": [ + " city_id name class white black ameri_es \\\n", + "0 1 Honolulu Census Designated Place 73093 6038 689 \n", + "1 2 Juneau City and Borough 22969 248 3496 \n", + "2 3 Boise City City 171204 1437 1300 \n", + "3 4 Olympia City 36246 805 553 \n", + "4 5 Salem City 113746 1750 2064 \n", + "\n", + " asian hawn_pl hispanic males females \n", + "0 207588 25457 16229 182628 189029 \n", + "1 1438 116 1040 15469 15242 \n", + "2 3870 302 8410 92014 93773 \n", + "3 2473 125 1863 20319 22195 \n", + "4 3304 643 19973 68752 68172 " + ] + }, + "execution_count": 26, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# read the third csv set\n", + "csv3 = 'data/updating_gis_content/capitals_annex.csv'\n", + "cities_df_3 = pd.read_csv(csv3)\n", + "cities_df_3.head()" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "(51, 11)" + ] + }, + "execution_count": 27, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "#find the number of rows in the third csv\n", + "cities_df_3.shape" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The `capitals_annex.csv` does not add new features, instead it adds additional attribute columns to existing features. It has 51 rows which were found to match the 19 + 32 rows from first and second csv files. The columns `City_ID` and `NAME` are common to all 3 spreadsheets. Next let us take a look at how we can append this additional attribute information to our feature layer.\n", + "\n", + "#### Inspecting existing fields of the feature layer\n", + "The `manager` property of the `FeatureLayer` object exposes a set of methods to read and update the properties and definition of feature layers." + ] + }, + { + "cell_type": "code", + "execution_count": 28, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{\n", + " \"name\": \"city_id\",\n", + " \"type\": \"esriFieldTypeInteger\",\n", + " \"alias\": \"city_id\",\n", + " \"domain\": null,\n", + " \"editable\": true,\n", + " \"nullable\": true,\n", + " \"sqlType\": \"sqlTypeInteger\",\n", + " \"length\": 10\n", + "}" + ] + }, + "execution_count": 28, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "#Get the existing list of fields on the cities feature layer\n", + "cities_fields = cities_flayer.manager.properties.fields\n", + "\n", + "# Your feature layer may have multiple fields, \n", + "# instead of printing all, let us take a look at one of the fields:\n", + "cities_fields[1]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "From above, we can see the representation of one of the fields. 
Let us loop through each of the fields and print the `name`, `alias`, `type` and `sqlType` properties:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 29,
+ "metadata": {},
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "objectid     | objectid     | esriFieldTypeOID         | sqlTypeInteger\n",
+ "city_id      | city_id      | esriFieldTypeInteger     | sqlTypeInteger\n",
+ "name         | name         | esriFieldTypeString      | sqlTypeVarchar\n",
+ "state        | state        | esriFieldTypeString      | sqlTypeVarchar\n",
+ "capital      | capital      | esriFieldTypeString      | sqlTypeVarchar\n",
+ "pop2000      | pop2000      | esriFieldTypeInteger     | sqlTypeInteger\n",
+ "pop2007      | pop2007      | esriFieldTypeInteger     | sqlTypeInteger\n",
+ "longitude    | longitude    | esriFieldTypeDouble      | sqlTypeNumeric\n",
+ "latitude     | latitude     | esriFieldTypeDouble      | sqlTypeNumeric\n"
+ ]
+ }
+ ],
+ "source": [
+ "for field in cities_fields:\n",
+ "    print(f\"{field.name:13}| {field.alias:13}| {field.type:25}| {field.sqlType}\")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "#### Preparing additional columns to add to the feature layer\n",
+ "Now that we have an idea of how the fields are defined, we can go ahead and append new fields to the layer's definition. Once we compose a list of new fields, we can push those changes to the feature layer by calling the `add_to_definition()` method. Once the feature layer's definition is updated with the new fields, we can loop through each feature and add the appropriate attribute values.\n",
+ "\n",
+ "To compose the list of new fields, we start by making a copy of one of the existing fields to use as a template, then edit it. Conveniently, in this example all the new fields except one share the same data type: integer. With your data, this may not be the case; in such instances, you can add each field individually, as sketched further below."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 30,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "{'name': 'city_id',\n",
+ " 'type': 'esriFieldTypeInteger',\n",
+ " 'alias': 'city_id',\n",
+ " 'domain': None,\n",
+ " 'editable': True,\n",
+ " 'nullable': True,\n",
+ " 'sqlType': 'sqlTypeInteger',\n",
+ " 'length': 10}"
+ ]
+ },
+ "execution_count": 30,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "# get a template field\n",
+ "template_field = dict(deepcopy(cities_fields[1]))\n",
+ "template_field"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Let us use pandas to get the list of fields that are **new** in spreadsheet 3:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 31,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "['ameri_es',\n",
+ " 'asian',\n",
+ " 'black',\n",
+ " 'class',\n",
+ " 'females',\n",
+ " 'hawn_pl',\n",
+ " 'hispanic',\n",
+ " 'males',\n",
+ " 'white']"
+ ]
+ },
+ "execution_count": 31,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "# get the list of new fields to add from the third spreadsheet, that are not in spreadsheets 1 and 2\n",
+ "new_field_names = list(cities_df_3.columns.difference(cities_df_1.columns))\n",
+ "new_field_names"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now loop through each new field name and create a field dictionary using the template we created earlier. Except for the field titled `class`, all other fields are of type `integer`.\n",
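+ "\n",
+ "If you ever need to add a field on its own, the same template approach works one field at a time. A sketch (the `notes` field below is hypothetical, purely for illustration):\n",
+ "\n",
+ "```python\n",
+ "# build a single text field from the template and push just that one field\n",
+ "notes_field = deepcopy(template_field)\n",
+ "notes_field.update({'name': 'notes', 'alias': 'Notes',\n",
+ "                    'type': 'esriFieldTypeString',\n",
+ "                    'sqlType': 'sqlTypeVarchar', 'length': 256})\n",
+ "# cities_flayer.manager.add_to_definition({'fields': [notes_field]})\n",
+ "```"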
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 32,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "9"
+ ]
+ },
+ "execution_count": 32,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "fields_to_be_added = []\n",
+ "for new_field_name in new_field_names:\n",
+ "    current_field = deepcopy(template_field)\n",
+ "    if new_field_name.lower() == 'class':\n",
+ "        current_field['sqlType'] = 'sqlTypeVarchar'\n",
+ "        current_field['type'] = 'esriFieldTypeString'\n",
+ "        current_field['length'] = 8000\n",
+ "    current_field['name'] = new_field_name.lower()\n",
+ "    current_field['alias'] = new_field_name\n",
+ "    fields_to_be_added.append(current_field)\n",
+ "    \n",
+ "len(fields_to_be_added)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 33,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "{'name': 'class',\n",
+ " 'type': 'esriFieldTypeString',\n",
+ " 'alias': 'class',\n",
+ " 'domain': None,\n",
+ " 'editable': True,\n",
+ " 'nullable': True,\n",
+ " 'sqlType': 'sqlTypeVarchar',\n",
+ " 'length': 8000}"
+ ]
+ },
+ "execution_count": 33,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "# inspect one of the fields\n",
+ "fields_to_be_added[3]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "#### Adding additional fields to the feature layer\n",
+ "The list of new fields we composed can be pushed to the server by calling the `add_to_definition()` method on the `manager` property."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 34,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "{'success': True}"
+ ]
+ },
+ "execution_count": 34,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "cities_flayer.manager.add_to_definition({'fields':fields_to_be_added})"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Thus, we have successfully added new fields to our feature layer.
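\n",
+ "\n",
+ "Should a field ever be added by mistake, the `manager` can also remove it again; a short sketch (the `temp_notes` field here is hypothetical):\n",
+ "\n",
+ "```python\n",
+ "# delete_from_definition is the inverse of add_to_definition\n",
+ "cities_flayer.manager.delete_from_definition({'fields': [{'name': 'temp_notes'}]})\n",
+ "```\n",
+ "\n",
+ "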
Let us verify the new columns show up:" + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "18" + ] + }, + "execution_count": 35, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "new_cities_fields = cities_flayer.manager.properties.fields\n", + "len(new_cities_fields)" + ] + }, + { + "cell_type": "code", + "execution_count": 36, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "objectid | esriFieldTypeOID\n", + "city_id | esriFieldTypeInteger\n", + "name | esriFieldTypeString\n", + "state | esriFieldTypeString\n", + "capital | esriFieldTypeString\n", + "pop2000 | esriFieldTypeInteger\n", + "pop2007 | esriFieldTypeInteger\n", + "longitude | esriFieldTypeDouble\n", + "latitude | esriFieldTypeDouble\n", + "ameri_es | esriFieldTypeInteger\n", + "asian | esriFieldTypeInteger\n", + "black | esriFieldTypeInteger\n", + "class | esriFieldTypeString\n", + "females | esriFieldTypeInteger\n", + "hawn_pl | esriFieldTypeInteger\n", + "hispanic | esriFieldTypeInteger\n", + "males | esriFieldTypeInteger\n", + "white | esriFieldTypeInteger\n" + ] + } + ], + "source": [ + "for field in new_cities_fields:\n", + " print(f\"{field.name:10}| {field.type}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Adding attribute values to the new columns\n", + "Next we can loop through each row in the third csv and add the new attribute values for these newly created columns." + ] + }, + { + "cell_type": "code", + "execution_count": 37, + "metadata": {}, + "outputs": [], + "source": [ + "# Run a fresh query on the feature layer so it includes the new features from\n", + "# csv2 and new columns from csv3\n", + "cities_fset2 = cities_flayer.query()\n", + "cities_features2 = cities_fset2.features" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Loop through each row in the third spreadsheet, find the corresponding feature by matching the `city_id` value and apply the attribute values for the new fields." + ] + }, + { + "cell_type": "code", + "execution_count": 38, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "1 Adding additional attributes for: Honolulu\n", + "2 Adding additional attributes for: Juneau\n", + "3 Adding additional attributes for: Boise City\n", + "4 Adding additional attributes for: Olympia\n", + "5 Adding additional attributes for: Salem\n", + "6 Adding additional attributes for: Carson\n", + "7 Adding additional attributes for: Sacramento\n", + "8 Adding additional attributes for: Phoenix\n", + "9 Adding additional attributes for: Salt Lake City\n", + "10 Adding additional attributes for: Cheyenne\n", + "11 Adding additional attributes for: Denver\n", + "12 Adding additional attributes for: Santa Fe\n", + "13 Adding additional attributes for: Oklahoma City\n", + "14 Adding additional attributes for: Topeka\n", + "15 Adding additional attributes for: Lincoln\n", + "16 Adding additional attributes for: Des Moines\n", + "17 Adding additional attributes for: Jefferson City\n", + "18 Adding additional attributes for: Little Rock\n", + "19 Adding additional attributes for: Austin\n", + "20 Adding additional attributes for: Baton Rouge\n", + "21 Adding additional attributes for: Helena\n", + "22 Adding additional attributes for: Bismarck\n", + "23 Adding additional attributes for: Pierre\n", + "24 Adding additional attributes for: St. 
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "#### Adding attribute values to the new columns\n",
+    "Next, we can loop through each row in the third csv and add the new attribute values for these newly created columns."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 37,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Run a fresh query on the feature layer so it includes the new features from\n",
+    "# csv2 and new columns from csv3\n",
+    "cities_fset2 = cities_flayer.query()\n",
+    "cities_features2 = cities_fset2.features"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Loop through each row in the third spreadsheet, find the corresponding feature by matching the `city_id` value, and apply the attribute values for the new fields."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 38,
+   "metadata": {},
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "1 Adding additional attributes for: Honolulu\n",
+      "2 Adding additional attributes for: Juneau\n",
+      "3 Adding additional attributes for: Boise City\n",
+      "4 Adding additional attributes for: Olympia\n",
+      "5 Adding additional attributes for: Salem\n",
+      "6 Adding additional attributes for: Carson\n",
+      "7 Adding additional attributes for: Sacramento\n",
+      "8 Adding additional attributes for: Phoenix\n",
+      "9 Adding additional attributes for: Salt Lake City\n",
+      "10 Adding additional attributes for: Cheyenne\n",
+      "11 Adding additional attributes for: Denver\n",
+      "12 Adding additional attributes for: Santa Fe\n",
+      "13 Adding additional attributes for: Oklahoma City\n",
+      "14 Adding additional attributes for: Topeka\n",
+      "15 Adding additional attributes for: Lincoln\n",
+      "16 Adding additional attributes for: Des Moines\n",
+      "17 Adding additional attributes for: Jefferson City\n",
+      "18 Adding additional attributes for: Little Rock\n",
+      "19 Adding additional attributes for: Austin\n",
+      "20 Adding additional attributes for: Baton Rouge\n",
+      "21 Adding additional attributes for: Helena\n",
+      "22 Adding additional attributes for: Bismarck\n",
+      "23 Adding additional attributes for: Pierre\n",
+      "24 Adding additional attributes for: St. Paul\n",
+      "25 Adding additional attributes for: Madison\n",
+      "26 Adding additional attributes for: Lansing\n",
+      "27 Adding additional attributes for: Augusta\n",
+      "28 Adding additional attributes for: Montpelier\n",
+      "29 Adding additional attributes for: Albany\n",
+      "30 Adding additional attributes for: Boston\n",
+      "31 Adding additional attributes for: Concord\n",
+      "32 Adding additional attributes for: Tallahassee\n",
+      "33 Adding additional attributes for: Jackson\n",
+      "34 Adding additional attributes for: Nashville\n",
+      "35 Adding additional attributes for: Montgomery\n",
+      "36 Adding additional attributes for: Springfield\n",
+      "37 Adding additional attributes for: Indianapolis\n",
+      "38 Adding additional attributes for: Frankfort\n",
+      "39 Adding additional attributes for: Columbus\n",
+      "40 Adding additional attributes for: Charleston\n",
+      "41 Adding additional attributes for: Atlanta\n",
+      "42 Adding additional attributes for: Columbia\n",
+      "43 Adding additional attributes for: Raleigh\n",
+      "44 Adding additional attributes for: Richmond\n",
+      "45 Adding additional attributes for: Harrisburg\n",
+      "46 Adding additional attributes for: Trenton\n",
+      "47 Adding additional attributes for: Dover\n",
+      "48 Adding additional attributes for: Washington\n",
+      "49 Adding additional attributes for: Annapolis\n",
+      "50 Adding additional attributes for: Hartford\n",
+      "51 Adding additional attributes for: Providence\n"
+     ]
+    }
+   ],
+   "source": [
+    "features_for_update = []\n",
+    "for city_id in cities_df_3['city_id']:\n",
+    "    # get the matching row from the csv\n",
+    "    matching_row = cities_df_3.where(cities_df_3.city_id == city_id).dropna()\n",
+    "    \n",
+    "    print(str(city_id) + \" Adding additional attributes for: \" + matching_row['name'].values[0])\n",
+    "    \n",
+    "    # get the feature to be updated\n",
+    "    original_feature = [f for f in cities_features2 if f.attributes['city_id'] == city_id][0]\n",
+    "    feature_to_be_updated = deepcopy(original_feature)\n",
+    "    \n",
+    "    # assign the updated values; .values[0] extracts the scalar from the\n",
+    "    # single-row selection (calling int() on a Series directly is deprecated)\n",
+    "    feature_to_be_updated.attributes['class'] = matching_row['class'].values[0]\n",
+    "    feature_to_be_updated.attributes['white'] = int(matching_row['white'].values[0])\n",
+    "    feature_to_be_updated.attributes['black'] = int(matching_row['black'].values[0])\n",
+    "    feature_to_be_updated.attributes['ameri_es'] = int(matching_row['ameri_es'].values[0])\n",
+    "    feature_to_be_updated.attributes['asian'] = int(matching_row['asian'].values[0])\n",
+    "    feature_to_be_updated.attributes['hawn_pl'] = int(matching_row['hawn_pl'].values[0])\n",
+    "    feature_to_be_updated.attributes['hispanic'] = int(matching_row['hispanic'].values[0])\n",
+    "    feature_to_be_updated.attributes['males'] = int(matching_row['males'].values[0])\n",
+    "    feature_to_be_updated.attributes['females'] = int(matching_row['females'].values[0])\n",
+    "    \n",
+    "    # add this to the list of features to be updated\n",
+    "    features_for_update.append(feature_to_be_updated)"
+   ]
+  },
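+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "A note on the lookup above: the list comprehension scans `cities_features2` once per csv row, which is perfectly fine for 51 capitals. For much larger layers, one could build a dictionary keyed on `city_id` first, so each lookup is constant time. A minimal sketch of that variation (not executed here):"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# build the index once, then each lookup inside the loop is O(1)\n",
+    "id_to_feature = {f.attributes['city_id']: f for f in cities_features2}\n",
+    "# original_feature = id_to_feature[city_id]  # would replace the list comprehension"
+   ]
+  },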
"output_type": "execute_result" + } + ], + "source": [ + "# inspect one of the features\n", + "features_for_update[-1]" + ] + }, + { + "cell_type": "code", + "execution_count": 40, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{'addResults': [],\n", + " 'updateResults': [{'objectId': 1, 'globalId': None, 'success': True},\n", + " {'objectId': 2, 'globalId': None, 'success': True},\n", + " {'objectId': 3, 'globalId': None, 'success': True},\n", + " {'objectId': 4, 'globalId': None, 'success': True},\n", + " {'objectId': 5, 'globalId': None, 'success': True},\n", + " {'objectId': 6, 'globalId': None, 'success': True},\n", + " {'objectId': 7, 'globalId': None, 'success': True},\n", + " {'objectId': 8, 'globalId': None, 'success': True},\n", + " {'objectId': 9, 'globalId': None, 'success': True},\n", + " {'objectId': 10, 'globalId': None, 'success': True},\n", + " {'objectId': 11, 'globalId': None, 'success': True},\n", + " {'objectId': 12, 'globalId': None, 'success': True},\n", + " {'objectId': 13, 'globalId': None, 'success': True},\n", + " {'objectId': 14, 'globalId': None, 'success': True},\n", + " {'objectId': 15, 'globalId': None, 'success': True},\n", + " {'objectId': 16, 'globalId': None, 'success': True},\n", + " {'objectId': 17, 'globalId': None, 'success': True},\n", + " {'objectId': 18, 'globalId': None, 'success': True},\n", + " {'objectId': 19, 'globalId': None, 'success': True},\n", + " {'objectId': 20, 'globalId': None, 'success': True},\n", + " {'objectId': 21, 'globalId': None, 'success': True},\n", + " {'objectId': 22, 'globalId': None, 'success': True},\n", + " {'objectId': 23, 'globalId': None, 'success': True},\n", + " {'objectId': 24, 'globalId': None, 'success': True},\n", + " {'objectId': 25, 'globalId': None, 'success': True},\n", + " {'objectId': 26, 'globalId': None, 'success': True},\n", + " {'objectId': 27, 'globalId': None, 'success': True},\n", + " {'objectId': 28, 'globalId': None, 'success': True},\n", + " {'objectId': 29, 'globalId': None, 'success': True},\n", + " {'objectId': 30, 'globalId': None, 'success': True},\n", + " {'objectId': 31, 'globalId': None, 'success': True},\n", + " {'objectId': 32, 'globalId': None, 'success': True},\n", + " {'objectId': 33, 'globalId': None, 'success': True},\n", + " {'objectId': 34, 'globalId': None, 'success': True},\n", + " {'objectId': 35, 'globalId': None, 'success': True},\n", + " {'objectId': 36, 'globalId': None, 'success': True},\n", + " {'objectId': 37, 'globalId': None, 'success': True},\n", + " {'objectId': 38, 'globalId': None, 'success': True},\n", + " {'objectId': 39, 'globalId': None, 'success': True},\n", + " {'objectId': 40, 'globalId': None, 'success': True},\n", + " {'objectId': 41, 'globalId': None, 'success': True},\n", + " {'objectId': 42, 'globalId': None, 'success': True},\n", + " {'objectId': 43, 'globalId': None, 'success': True},\n", + " {'objectId': 44, 'globalId': None, 'success': True},\n", + " {'objectId': 45, 'globalId': None, 'success': True},\n", + " {'objectId': 46, 'globalId': None, 'success': True},\n", + " {'objectId': 47, 'globalId': None, 'success': True},\n", + " {'objectId': 48, 'globalId': None, 'success': True},\n", + " {'objectId': 49, 'globalId': None, 'success': True},\n", + " {'objectId': 50, 'globalId': None, 'success': True},\n", + " {'objectId': 51, 'globalId': None, 'success': True}],\n", + " 'deleteResults': [],\n", + " 'attachments': {'addResults': [], 'updateResults': [], 'deleteResults': []}}" + ] + }, + "execution_count": 40, + "metadata": 
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "#### Verify the changes made so far\n",
+    "Let us run another query on the feature layer and visualize a few rows."
+   ]
+  },
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
SHAPEameri_esasianblackcapitalcity_idclassfemaleshawn_plhispaniclatitudelongitudemalesnameobjectidpop2000pop2007statewhite
0{\"x\": -17568824.553, \"y\": 2428377.352700006, \"...6892075886038State1Census Designated Place189029254571622921.305782-157.823436182628Honolulu1371657378587HI73093
1{\"x\": -14973760.769500002, \"y\": 8041504.674200...34961438248State2City and Borough15242116104058.351418-134.51158215469Juneau23071131592AK22969
2{\"x\": -12939516.521100001, \"y\": 5405860.248099...130038701437State3City93773302841043.613736-116.23765592014Boise City3185787203529ID171204
3{\"x\": -13680394.263852797, \"y\": 5949000.547924...5532473805State4City22195125186347.042418-122.89307320319Olympia44251445523WA36246
4{\"x\": -13695542.842799995, \"y\": 5610682.544100...206433041750State5City681726431997344.931109-123.02915568752Salem5136924152039OR113746
\n", + "
" + ], + "text/plain": [ + " SHAPE ameri_es asian black \\\n", + "0 {\"x\": -17568824.553, \"y\": 2428377.352700006, \"... 689 207588 6038 \n", + "1 {\"x\": -14973760.769500002, \"y\": 8041504.674200... 3496 1438 248 \n", + "2 {\"x\": -12939516.521100001, \"y\": 5405860.248099... 1300 3870 1437 \n", + "3 {\"x\": -13680394.263852797, \"y\": 5949000.547924... 553 2473 805 \n", + "4 {\"x\": -13695542.842799995, \"y\": 5610682.544100... 2064 3304 1750 \n", + "\n", + " capital city_id class females hawn_pl hispanic \\\n", + "0 State 1 Census Designated Place 189029 25457 16229 \n", + "1 State 2 City and Borough 15242 116 1040 \n", + "2 State 3 City 93773 302 8410 \n", + "3 State 4 City 22195 125 1863 \n", + "4 State 5 City 68172 643 19973 \n", + "\n", + " latitude longitude males name objectid pop2000 pop2007 \\\n", + "0 21.305782 -157.823436 182628 Honolulu 1 371657 378587 \n", + "1 58.351418 -134.511582 15469 Juneau 2 30711 31592 \n", + "2 43.613736 -116.237655 92014 Boise City 3 185787 203529 \n", + "3 47.042418 -122.893073 20319 Olympia 4 42514 45523 \n", + "4 44.931109 -123.029155 68752 Salem 5 136924 152039 \n", + "\n", + " state white \n", + "0 HI 73093 \n", + "1 AK 22969 \n", + "2 ID 171204 \n", + "3 WA 36246 \n", + "4 OR 113746 " + ] + }, + "execution_count": 41, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "cities_fset3 = cities_flayer.query()\n", + "cities_fset3.sdf.head(5)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Conclusion\n", + "In this sample, we observed an edit intensive method to keep feature layers updated. We published data from first spreadsheet as a feature layer. We then updated existing features from second spread sheet (used geometry module to project the coordinates in the process), and added new features. The third spreadsheet presented additional attribute columns which were added to the feature layer by editing its definition and then updating the features with this additional data.\n", + "\n", + "This method is editing intensive and you may choose this when the number of features to edit is less or if you needed to selectively update certain features as updates come in.\n", + "\n", + "An alternate method is to overwrite the feature layer altogether when you always have current information coming in. This method is explained in the sample [Overwriting feature layers](https://developers.arcgis.com/python/sample-notebooks/overwriting-feature-layers)" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "esriNotebookRuntime": { + "notebookRuntimeName": "ArcGIS Notebook Python 3 Standard", + "notebookRuntimeVersion": "4.0" + }, + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.11.0" + }, + "toc": { + "base_numbering": 1, + "nav_menu": {}, + "number_sections": true, + "sideBar": true, + "skip_h1_title": false, + "title_cell": "Table of Contents", + "title_sidebar": "Contents", + "toc_cell": true, + "toc_position": {}, + "toc_section_display": true, + "toc_window_display": true + } + }, + "nbformat": 4, + "nbformat_minor": 4 +}