adding in Python 3.12 support, warnings, and doc updates #301

Merged · 3 commits · May 23, 2024
2 changes: 1 addition & 1 deletion .github/workflows/test-package.yml
@@ -62,7 +62,7 @@ jobs:
strategy:
fail-fast: false
matrix:
python-version: [3.9, '3.10', '3.11']
python-version: [3.9, '3.10', '3.11', '3.12']

env:
PYTHON_VERSION: ${{ matrix.python-version }}
2 changes: 1 addition & 1 deletion CODEOWNERS
@@ -1 +1 @@
* @fdosani @ak-gupta @jdawang @gladysteh99 @NikhilJArora
* @fdosani @ak-gupta @jdawang @gladysteh99
7 changes: 4 additions & 3 deletions README.md
@@ -33,7 +33,6 @@ If you would like to use Spark or any other backends please make sure you instal
pip install datacompy[spark]
pip install datacompy[dask]
pip install datacompy[duckdb]
pip install datacompy[polars]
pip install datacompy[ray]

```
@@ -47,7 +46,7 @@ The original ``SparkCompare`` implementation differs from all the other native i
If you wish to use the old SparkCompare moving forward, you can import it with:

```python
import datacompy.legacy.LegacySparkCompare
from datacompy.legacy import LegacySparkCompare
```
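
A minimal usage sketch, assuming ``LegacySparkCompare`` keeps the original ``SparkCompare`` constructor (a ``SparkSession``, two Spark DataFrames, and ``join_columns``); check the legacy module for the exact signature:

```python
from pyspark.sql import SparkSession
from datacompy.legacy import LegacySparkCompare

spark = SparkSession.builder.getOrCreate()

base_df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
compare_df = spark.createDataFrame([(1, "a"), (2, "c")], ["id", "value"])

# Assumed constructor arguments, mirroring the pre-0.12 SparkCompare API
comparison = LegacySparkCompare(spark, base_df, compare_df, join_columns=["id"])
comparison.report()  # writes a comparison report to stdout
```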

#### Supported versions and dependencies
@@ -79,7 +78,9 @@ With version ``0.12.0``:


> [!NOTE]
> At the current time Python `3.12` is not supported by Spark, nor by Ray within Fugue.
> If you are using Python `3.12` and above, please note that not all functionality will be supported.
> Pandas and Polars are supported and tested (see the sketch below).
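
The pandas path, for example, works on any supported interpreter. A minimal sketch (the column names are illustrative):

```python
import pandas as pd
import datacompy

df1 = pd.DataFrame({"id": [1, 2, 3], "amount": [10.0, 20.0, 30.0]})
df2 = pd.DataFrame({"id": [1, 2, 3], "amount": [10.0, 20.0, 30.5]})

compare = datacompy.Compare(df1, df2, join_columns="id")
print(compare.matches())  # False: one "amount" value differs
print(compare.report())   # human-readable summary of the differences
```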

## Supported backends

16 changes: 15 additions & 1 deletion datacompy/__init__.py
@@ -13,7 +13,10 @@
# See the License for the specific language governing permissions and
# limitations under the License.

__version__ = "0.12.0"
__version__ = "0.12.1"

import platform
from warnings import warn

from datacompy.core import *
from datacompy.fugue import (
@@ -27,3 +30,14 @@
)
from datacompy.polars import PolarsCompare
from datacompy.spark import SparkCompare

# platform.python_version_tuple() returns strings, so cast to int before comparing
major = int(platform.python_version_tuple()[0])
minor = int(platform.python_version_tuple()[1])

if major == 3 and minor >= 12:
    warn(
        "Python 3.12 and above currently is not supported by Spark and Ray. "
        "Please note that some functionality will not work and is currently not supported.",
        UserWarning,
        stacklevel=2,
    )
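
For projects that only use the pandas or Polars paths on Python 3.12, the warning can be silenced before import with the standard `warnings` filters; a sketch, assuming the message text shown above:

```python
import warnings

# Ignore only datacompy's Python 3.12 warning; other UserWarnings still surface.
warnings.filterwarnings(
    "ignore",
    message="Python 3.12 and above currently is not supported",
    category=UserWarning,
)

import datacompy  # noqa: E402  -- imported after the filter on purpose
```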
5 changes: 3 additions & 2 deletions docs/source/install.rst
@@ -2,9 +2,10 @@
Installation
============

.. note::
.. important::

Moving forward ``datacompy`` will not support Python 2. Please make sure you are using Python 3.8+
If you are using Python 3.12 and above, please note that not all functionality will be supported.
Pandas and Polars are supported and tested.


PyPI (basic)
2 changes: 1 addition & 1 deletion docs/source/spark_usage.rst
@@ -13,7 +13,7 @@ Spark (Pandas on Spark) Usage

.. code-block:: python

import datacompy.legacy.LegacySparkCompare
from datacompy.legacy import LegacySparkCompare



13 changes: 10 additions & 3 deletions pyproject.toml
@@ -11,7 +11,13 @@ maintainers = [
{ name="Faisal Dosani", email="faisal.dosani@capitalone.com" }
]
license = {text = "Apache Software License"}
dependencies = ["pandas<=2.2.2,>=0.25.0", "numpy<=1.26.4,>=1.22.0", "ordered-set<=4.1.0,>=4.0.2", "fugue<=0.8.7,>=0.8.7"]
dependencies = [
"pandas<=2.2.2,>=0.25.0",
"numpy<=1.26.4,>=1.22.0",
"ordered-set<=4.1.0,>=4.0.2",
"fugue<=0.8.7,>=0.8.7",
"polars<=0.20.27,>=0.20.4",
]
requires-python = ">=3.9.0"
classifiers = [
"Intended Audience :: Developers",
@@ -23,6 +29,7 @@ classifiers = [
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
]

dynamic = ["version"]
@@ -54,7 +61,6 @@ python-tag = "py3"

[project.optional-dependencies]
duckdb = ["fugue[duckdb]"]
polars = ["polars"]
spark = ["pyspark>=3.1.1; python_version < \"3.11\"", "pyspark>=3.4; python_version >= \"3.11\""]
dask = ["fugue[dask]"]
ray = ["fugue[ray]"]
Expand All @@ -65,7 +71,7 @@ tests-spark = ["pytest", "pytest-cov", "pytest-spark", "spark"]
qa = ["pre-commit", "black", "isort", "mypy", "pandas-stubs"]
build = ["build", "twine", "wheel"]
edgetest = ["edgetest", "edgetest-conda"]
dev = ["datacompy[duckdb]", "datacompy[polars]", "datacompy[spark]", "datacompy[docs]", "datacompy[tests]", "datacompy[tests-spark]", "datacompy[qa]", "datacompy[build]"]
dev = ["datacompy[duckdb]", "datacompy[spark]", "datacompy[docs]", "datacompy[tests]", "datacompy[tests-spark]", "datacompy[qa]", "datacompy[build]"]

[tool.isort]
multi_line_output = 3
@@ -96,4 +102,5 @@ upgrade = [
"numpy",
"ordered-set",
"fugue",
"polars",
]