
Use fabi-version=11 to ensure compatibility between gcc7 and gcc9 binaries · atalman/pytorch@dd558e4 · GitHub

Commit dd558e4

Use fabi-version=11 to ensure compatibility between gcc7 and gcc9 binaries (pytorch#81058)
Summary: Fixes: pytorch#80489

Test using the CUDA 11.3 manywheel binary:

```python
import torch
print(torch.__version__)
print(torch._C._PYBIND11_BUILD_ABI)
```

Output:

```
1.13.0.dev20220707+cu113
_cxxabi1011
```

functorch test (torch: 1.13.0.dev20220707+cu113, functorch built with cu102):

```python
import torch
print(torch.__version__)
print(torch._C._PYBIND11_BUILD_ABI)

from functorch import vmap
x = torch.randn(2, 3, 5)
vmap(lambda x: x, out_dims=3)(x)
```

Output:

```
1.13.0.dev20220707+cu113
_cxxabi1011
/home/atalman/temp/testc1.py:5: UserWarning: Failed to initialize NumPy: No module named 'numpy' (Triggered internally at ../torch/csrc/utils/tensor_numpy.cpp:73.)
  x = torch.randn(2, 3, 5)
Traceback (most recent call last):
  File "/home/atalman/temp/testc1.py", line 6, in <module>
    vmap(lambda x: x, out_dims=3)(x)
  File "/home/atalman/conda/lib/python3.9/site-packages/functorch/_src/vmap.py", line 361, in wrapped
    return _flat_vmap(
  File "/home/atalman/conda/lib/python3.9/site-packages/functorch/_src/vmap.py", line 488, in _flat_vmap
    return _unwrap_batched(batched_outputs, out_dims, vmap_level, batch_size, func)
  File "/home/atalman/conda/lib/python3.9/site-packages/functorch/_src/vmap.py", line 165, in _unwrap_batched
    flat_outputs = [
  File "/home/atalman/conda/lib/python3.9/site-packages/functorch/_src/vmap.py", line 166, in <listcomp>
    _remove_batch_dim(batched_output, vmap_level, batch_size, out_dim)
IndexError: Dimension out of range (expected to be in range of [-3, 2], but got 3)
```

Related builder PR: pytorch/builder#1083
Test PR: pytorch#81232

Pull Request resolved: pytorch#81058
Approved by: https://github.com/zou3519, https://github.com/malfet

Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/d552ba3b4f53da9b6a5f6e0463111e43b367ef8a

Reviewed By: DanilBaibak
Differential Revision: D37813240
Pulled By: atalman
fbshipit-source-id: 94d94e777b0e9d5da106173c06117b3019ba71c4
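The test above works by comparing pybind11 ABI tags: two extensions (here torch and functorch) can only exchange C++ objects safely when their tags match exactly, which is why pinning `-fabi-version` matters. A minimal sketch of that comparison, assuming only the tag format shown in the output above (`_cxxabi1011`); the helper name `abi_compatible` and the mismatched tag used in the comment are hypothetical, not part of PyTorch:

```python
# Sketch: pybind11-built extensions interoperate only when their ABI tag
# strings match exactly. PyTorch exposes its tag as
# torch._C._PYBIND11_BUILD_ABI, e.g. "_cxxabi1011" in the test output above.

def abi_compatible(tag_a: str, tag_b: str) -> bool:
    """Return True if two pybind11 ABI tag strings match exactly."""
    return tag_a == tag_b

# A wheel built with a different (hypothetical) ABI version would report a
# different tag and fail this check, reproducing the mismatch fixed here:
# abi_compatible("_cxxabi1011", "_cxxabi1013") -> False
```

Before this commit, a gcc9-built torch wheel and a gcc7-built functorch wheel reported different tags, and cross-library `vmap` calls failed as shown in the traceback above.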
1 parent 1680cd0 commit dd558e4

File tree

1 file changed: +4 −0 lines changed


CMakeLists.txt

Lines changed: 4 additions & 0 deletions

```diff
@@ -44,6 +44,10 @@ if(DEFINED GLIBCXX_USE_CXX11_ABI)
   if(${GLIBCXX_USE_CXX11_ABI} EQUAL 1)
     set(CXX_STANDARD_REQUIRED ON)
     set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -D_GLIBCXX_USE_CXX11_ABI=1")
+  else()
+    # Please note this is required in order to ensure compatibility between gcc 9 and gcc 7
+    # This could be removed when all Linux PyTorch binary builds are compiled by the same toolchain again
+    string(APPEND CMAKE_CXX_FLAGS " -fabi-version=11")
   endif()
 endif()
```
