Initializing a Hugging Face transformers pipeline causes my loop to restart. I wrote a simple loop that reads text and replies, but when the chatbot pipeline is initialized, the loop appears to restart in a new thread. A minimal reproduction is below.
import time

from transformers import pipeline

chatbot_model = pipeline(task="text-generation", model="facebook/blenderbot-400M-distill")

i = 0
while True:
    try:
        if i % 2 == 0:
            print(f"{i} - Even")
            i += 1
            time.sleep(0.5)
        else:
            i += 1
            continue
    except Exception:
        pass
The loop restarts after the chatbot initialization: after printing 40, a second counter seems to start from 0 and the two interleave. The same thing happens when I read files in a loop. Is this due to multiple threads (or processes)? How is this working? The output is below.
Device set to use cuda:0
0 - Even
2 - Even
4 - Even
6 - Even
8 - Even
10 - Even
12 - Even
14 - Even
16 - Even
18 - Even
20 - Even
22 - Even
24 - Even
26 - Even
28 - Even
30 - Even
32 - Even
34 - Even
36 - Even
38 - Even
40 - Even
The model 'TFBlenderbotForConditionalGeneration' is not supported for text-generation. Supported models are ['TFBertLMHeadModel', 'TFCamembertForCausalLM', 'TFCTRLLMHeadModel', 'TFGPT2LMHeadModel', 'TFGPT2LMHeadModel', 'TFGPTJForCausalLM', 'TFMistralForCausalLM',].
0 - Even
42 - Even
2 - Even
44 - Even
4 - Even
46 - Even
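My suspicion is that something during pipeline initialization starts a worker process that re-imports my script, so the unguarded top-level loop runs a second time. To check whether that mechanism alone can explain it, I isolated it in a standalone sketch (this script is my own test, not transformers code): with the "spawn" start method, a child process re-imports the main module, so any code outside an `if __name__ == "__main__":` guard executes again in the child.

```python
import os
import subprocess
import sys
import tempfile

# Child script: its top-level print runs once in the parent and once more
# in the spawned child, because "spawn" re-imports the main module.
# Code under the __main__ guard runs only in the parent.
script = """\
import multiprocessing as mp

print("top-level running in", mp.current_process().name)

def work():
    pass

if __name__ == "__main__":
    mp.set_start_method("spawn", force=True)
    p = mp.Process(target=work)
    p.start()
    p.join()
"""

# Write the script to a temp file and run it as a real __main__ module.
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write(script)
    path = f.name

out = subprocess.run([sys.executable, path], capture_output=True, text=True).stdout
os.unlink(path)
print(out)
```

The top-level line prints twice (once for `MainProcess`, once for the spawned child), which matches the interleaved double counter I see; if that is what is happening, wrapping my loop in an `if __name__ == "__main__":` guard should prevent the restart.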