Keras BERT example with Error: libtcmalloc_minimal.so.4: cannot allocate memory in static TLS block

Hi, I am trying to run this example from keras.io in Jupyter Lab,

but I get the following error when I run the code below:

import os
import re
import json
import string
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
from tokenizers import BertWordPieceTokenizer
from transformers import BertTokenizer, TFBertModel, BertConfig

max_len = 384
configuration = BertConfig()  # default parameters and configuration for BERT

Error:

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-1-1f9a8957001d> in <module>
      6 import tensorflow as tf
      7 from tokenizers import BertWordPieceTokenizer
----> 8 from transformers import BertTokenizer, TFBertModel, BertConfig
      9 from tensorflow import keras
     10 from tensorflow.keras import layers

~/archiconda3/envs/beam/lib/python3.6/site-packages/transformers/__init__.py in <module>
     97 
     98 # Pipelines
---> 99 from .pipelines import (
    100     CsvPipelineDataFormat,
    101     FeatureExtractionPipeline,

~/archiconda3/envs/beam/lib/python3.6/site-packages/transformers/pipelines.py in <module>
     34 from .file_utils import is_tf_available, is_torch_available
     35 from .modelcard import ModelCard
---> 36 from .tokenization_auto import AutoTokenizer
     37 from .tokenization_bert import BasicTokenizer
     38 from .tokenization_utils import PreTrainedTokenizer

~/archiconda3/envs/beam/lib/python3.6/site-packages/transformers/tokenization_auto.py in <module>
     46 from .tokenization_bert import BertTokenizer, BertTokenizerFast
     47 from .tokenization_bert_japanese import BertJapaneseTokenizer
---> 48 from .tokenization_camembert import CamembertTokenizer
     49 from .tokenization_ctrl import CTRLTokenizer
     50 from .tokenization_distilbert import DistilBertTokenizer, DistilBertTokenizerFast

~/archiconda3/envs/beam/lib/python3.6/site-packages/transformers/tokenization_camembert.py in <module>
     21 from typing import List, Optional
     22 
---> 23 import sentencepiece as spm
     24 
     25 from .tokenization_utils import PreTrainedTokenizer

~/archiconda3/envs/beam/lib/python3.6/site-packages/sentencepiece.py in <module>
     13     from . import _sentencepiece
     14 else:
---> 15     import _sentencepiece
     16 
     17 try:

ImportError: /usr/lib/aarch64-linux-gnu/libtcmalloc_minimal.so.4: cannot allocate memory in static TLS block

Can someone help with this error? I have tried another solution suggested by this post, but it didn't work…

I have also tried it in the terminal and got the same error:

(beam) root@jetson-nx:/# python
Python 3.6.10 | packaged by conda-forge | (default, Apr 24 2020, 16:15:58)
[GCC 7.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import os
>>> import re
>>> import json
>>> import string
>>> import numpy as np
>>> import tensorflow as tf
2020-06-13 04:14:40.151265: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library libcudart.so.10.2
2020-06-13 04:14:42.824809: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library libnvinfer.so.7
2020-06-13 04:14:42.828041: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library libnvinfer_plugin.so.7
>>> from tensorflow import keras
>>> from tensorflow.keras import layers
>>> from tokenizers import BertWordPieceTokenizer
>>> from transformers import BertTokenizer, TFBertModel, BertConfig
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/root/archiconda3/envs/beam/lib/python3.6/site-packages/transformers/__init__.py", line 99, in <module>
    from .pipelines import (
  File "/root/archiconda3/envs/beam/lib/python3.6/site-packages/transformers/pipelines.py", line 36, in <module>
    from .tokenization_auto import AutoTokenizer
  File "/root/archiconda3/envs/beam/lib/python3.6/site-packages/transformers/tokenization_auto.py", line 48, in <module>
    from .tokenization_camembert import CamembertTokenizer
  File "/root/archiconda3/envs/beam/lib/python3.6/site-packages/transformers/tokenization_camembert.py", line 23, in <module>
    import sentencepiece as spm
  File "/root/archiconda3/envs/beam/lib/python3.6/site-packages/sentencepiece.py", line 15, in <module>
    import _sentencepiece
ImportError: /usr/lib/aarch64-linux-gnu/libtcmalloc_minimal.so.4: cannot allocate memory in static TLS block

Update: I tried this

export LD_PRELOAD=/usr/lib/aarch64-linux-gnu/libtcmalloc_minimal.so.4

and now it gets a bit further, but fails with this error:

(beam) root@jetson-nx:/# python
Python 3.6.10 | packaged by conda-forge | (default, Apr 24 2020, 16:15:58)
[GCC 7.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import os
>>> import re
>>> import json
>>> import string
>>> import numpy as np
>>> import tensorflow as tf
2020-06-13 04:21:06.752965: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library libcudart.so.10.2
2020-06-13 04:21:09.365921: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library libnvinfer.so.7
2020-06-13 04:21:09.369567: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library libnvinfer_plugin.so.7
>>> from tensorflow import keras
>>> from tensorflow.keras import layers
>>> from tokenizers import BertWordPieceTokenizer
>>> from transformers import BertTokenizer, TFBertModel, BertConfig
Warning: please export TSAN_OPTIONS='ignore_noninstrumented_modules=1' to avoid false positive reports from the OpenMP runtime.!
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: cannot import name 'TFBertModel'

This stack trace shows the problem occurring when Python triggers a memory allocation request from C. Maybe one of the BERT components needs to be downloaded from git and compiled. If this functionality is wrapped inside a Docker container from NGC, then Python should be able to interface with it through HTTP POST.

You may like to try the BERT container from
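As a rough illustration of that idea only (the port and the /predict endpoint below are purely hypothetical; the actual API depends on which inference server the container exposes):

# Hypothetical: query a containerized BERT inference server over HTTP POST
curl -X POST http://localhost:8000/predict \
     -H "Content-Type: application/json" \
     -d '{"text": "sample input"}'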

@Andrey1984 This may be one possible solution, but it is not ideal.

The correct fix is preloading the library:
export LD_PRELOAD=/usr/lib/aarch64-linux-gnu/libtcmalloc_minimal.so.4
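
For the Jupyter Lab case from the original post, LD_PRELOAD has to be set before the kernel process starts. A minimal sketch (the library path is the one from the traceback above; adjust it for your system):

# Preload tcmalloc, then start Jupyter Lab so every kernel inherits the setting
export LD_PRELOAD=/usr/lib/aarch64-linux-gnu/libtcmalloc_minimal.so.4
jupyter lab

# Optionally make it persistent for future shells (assumes bash)
echo 'export LD_PRELOAD=/usr/lib/aarch64-linux-gnu/libtcmalloc_minimal.so.4' >> ~/.bashrc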

The error mentioned by Jerry_gzy, “ImportError: cannot import name ‘TFBertModel’”, is caused by the wrong TensorFlow version: TFBertModel requires TensorFlow 2. Install TensorFlow 2.
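
A quick way to check and fix this (sketch only; on a Jetson you would normally install NVIDIA's TensorFlow 2 wheel for aarch64 rather than the generic PyPI package):

# Print the TensorFlow version installed in the active environment
python -c "import tensorflow as tf; print(tf.__version__)"

# If it reports 1.x, upgrade to a 2.x build (generic pip shown;
# use the NVIDIA-provided Jetson wheel on aarch64)
pip install --upgrade "tensorflow>=2.0"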