
ENH: Getting NEP 50 behavior in the array API compat library #22341

@asmeurer

Description

Proposed new feature or change:

In #21626, NEP 50 is implemented as opt-in, via either a global setting or a context manager. I am working on an array API compatibility library (see the WIP at https://github.com/data-apis/numpy-array-api-compat). Unlike numpy.array_api, this compatibility library is separate from numpy, so it can be updated independently. It also aims to be usable by downstream libraries implementing against the array API. In particular, it is not a strict implementation like numpy.array_api: for instance, it will not raise errors for dtype combinations that are not required by the spec but are allowed by NumPy (see also https://numpy.org/doc/stable/reference/array_api.html).

For this library, I'd like to keep the wrapping to a minimum. In particular, I'd like to avoid creating a wrapper class for arrays, as was done in numpy.array_api, and instead use np.ndarray directly. Since we aren't going for strictness, this isn't a problem, but one issue is type promotion: the spec disallows value-based type promotion, along the lines of NEP 50.

The question, then, is how we can achieve this, so that users of the compatibility library get something that looks like NumPy but follows the array API specification. Neither of the implemented solutions looks very good here. We could set the global flag, but that could break other libraries in the same process that use NumPy outside of the array API; in general, it seems like bad practice for a library to set a global flag. The context manager doesn't seem helpful either, since most type promotion issues arise from operators. If a library (like scikit-learn) has some code like

import numpy_array_api_compat as np
a = np.asarray(...)
b = np.asarray(...)
# suppose a is an array and b is a scalar, so that NEP 50 is relevant
c = a + b

then the only way it can make a + b do the right type promotion is for scikit-learn itself to add the context manager, and it's burdensome to require every implementing library to do this.
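To make the burden concrete, here is a sketch of the call pattern every implementing library would need. weak_promotion here is a hypothetical no-op stand-in for the actual context manager from #21626, used only to show the shape of the code:

```python
from contextlib import contextmanager

import numpy as np

# Hypothetical stand-in for the opt-in context manager; a no-op that
# only marks the scope, to illustrate the required call pattern.
@contextmanager
def weak_promotion():
    yield

def downstream_sum(a, b):
    # Every downstream library would have to wrap each expression that
    # can promote, since the `+` operator itself can't be intercepted
    # by the compat library.
    with weak_promotion():
        return a + b

result = downstream_sum(np.asarray([1.0, 2.0]), 3.0)
```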

Ideally, we'd be able to set a promotion_state = 'weak' flag on arrays and scalars that would be carried along by subsequent operations. I could then wrap asarray and every other array-creating function in the compat library to set this flag, so that any (array API compatible) usage of those arrays automatically gets the correct type promotion behavior.
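As a rough illustration of how such a flag could propagate, here is a sketch using an ndarray subclass. This is purely illustrative (the actual proposal is a flag on ndarray itself, not a subclass); the propagation relies on NumPy's standard __array_finalize__ mechanism:

```python
import numpy as np

class WeakPromotionArray(np.ndarray):
    """Illustrative sketch: an ndarray subclass carrying a
    promotion_state flag that survives subsequent operations."""

    def __new__(cls, input_array):
        obj = np.asarray(input_array).view(cls)
        obj.promotion_state = "weak"
        return obj

    def __array_finalize__(self, obj):
        # Called for views, slices, and ufunc results, so the flag
        # propagates through arithmetic on these arrays.
        self.promotion_state = getattr(obj, "promotion_state", "weak")

def asarray(obj, **kwargs):
    # The compat library's asarray would hand out flagged arrays.
    return WeakPromotionArray(np.asarray(obj, **kwargs))

a = asarray([1, 2, 3])
c = a + 1  # the result still carries promotion_state == "weak"
```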

I'm not sure whether this is feasible. If anyone has other suggestions for how we could achieve this, let me know. The options I can think of are:

  • Add a promotion_state flag to ndarray/scalars.
  • Make a wrapper ndarray class in the compat library, similar to numpy.array_api. I'd prefer to avoid this but we can consider it if it's unavoidable.
  • Require implementing libraries to use the context manager.
  • Ignore type promotion differences between NumPy and the array API (this option is more or less the same as the previous one).
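If the wrapper-class route (the second option) turns out to be unavoidable, a stripped-down sketch might look like the following. WeakArray and its promotion handling are illustrative only, not the numpy.array_api implementation; a real version would implement the spec's full promotion table rather than just the scalar case:

```python
import numpy as np

class WeakArray:
    """Illustrative minimal wrapper: Python scalars are treated as
    'weak' and adopt the array's dtype, as NEP 50 and the array API
    spec require. Simplified; only __add__ is shown."""

    def __init__(self, array):
        self._array = np.asarray(array)

    @property
    def dtype(self):
        return self._array.dtype

    def __add__(self, other):
        if isinstance(other, (bool, int, float, complex)):
            # Weak promotion: the scalar takes the array's dtype, so
            # the result dtype is determined by the array alone.
            # (A real implementation would follow the spec's full
            # promotion rules, e.g. for float scalars + int arrays.)
            other = self._array.dtype.type(other)
        elif isinstance(other, WeakArray):
            other = other._array
        return WeakArray(self._array + other)

a = WeakArray(np.asarray([1, 2, 3], dtype=np.int8))
c = a + 1  # result dtype stays int8, independent of the scalar's value
```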
