Can someone help with this error? I have tried another solution from this post, but it didn't work.
I have also tried it directly in a terminal and got the same error:
(beam) root@jetson-nx:/# python
Python 3.6.10 | packaged by conda-forge | (default, Apr 24 2020, 16:15:58)
[GCC 7.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import os
>>> import re
>>> import json
>>> import string
>>> import numpy as np
>>> import tensorflow as tf
2020-06-13 04:14:40.151265: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library libcudart.so.10.2
2020-06-13 04:14:42.824809: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library libnvinfer.so.7
2020-06-13 04:14:42.828041: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library libnvinfer_plugin.so.7
>>> from tensorflow import keras
>>> from tensorflow.keras import layers
>>> from tokenizers import BertWordPieceTokenizer
>>> from transformers import BertTokenizer, TFBertModel, BertConfig
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/root/archiconda3/envs/beam/lib/python3.6/site-packages/transformers/__init__.py", line 99, in <module>
    from .pipelines import (
  File "/root/archiconda3/envs/beam/lib/python3.6/site-packages/transformers/pipelines.py", line 36, in <module>
    from .tokenization_auto import AutoTokenizer
  File "/root/archiconda3/envs/beam/lib/python3.6/site-packages/transformers/tokenization_auto.py", line 48, in <module>
    from .tokenization_camembert import CamembertTokenizer
  File "/root/archiconda3/envs/beam/lib/python3.6/site-packages/transformers/tokenization_camembert.py", line 23, in <module>
    import sentencepiece as spm
  File "/root/archiconda3/envs/beam/lib/python3.6/site-packages/sentencepiece.py", line 15, in <module>
    import _sentencepiece
ImportError: /usr/lib/aarch64-linux-gnu/libtcmalloc_minimal.so.4: cannot allocate memory in static TLS block
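One thing I am considering, but have not verified: for "cannot allocate memory in static TLS block" errors I have seen it suggested to preload the failing library with LD_PRELOAD so it gets a static TLS slot before Python and TensorFlow load their own libraries. The path below is just the one from my traceback; I don't know if this is the right fix here:

```shell
# Untested workaround idea: preload the library named in the ImportError
# so the dynamic loader assigns it a static TLS slot at process start,
# before TensorFlow's libraries consume the available slots.
export LD_PRELOAD=/usr/lib/aarch64-linux-gnu/libtcmalloc_minimal.so.4

# Then retry the failing import in a fresh interpreter
python -c "import sentencepiece"
```

Would this be a sane workaround on the Jetson, or does it just mask the real problem?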