We can only guess at why the interaction left you so uneasy, since we don't know the details of your conversation with Bing's chatbot. Still, we can speculate about some likely causes based on common problems with chatbots and artificial intelligence in general.
One possibility is that the chatbot misunderstood your questions or gave confusing or irrelevant responses. Chatbots frequently run into problems like this because their programming cannot fully capture the nuances of human language or context. If that were the case, the chatbot's replies may have left you frustrated or confused, which could have added to your unease.
The chatbot's responses might also have been too human-like without being quite human enough. As AI technology improves, chatbots are becoming increasingly sophisticated at replicating human conversation.
They may still lack emotional intelligence and empathy, however, which can make encounters with them feel cold or unsettling. You might have felt uneasy if the chatbot seemed to be trying too hard to appear human in its replies, or if it missed key emotional cues in the conversation.
It's also possible that the chatbot's replies crossed ethical or moral lines or otherwise disturbed you. For instance, you might have felt violated or unsettled if the chatbot made inappropriate or offensive comments, or if it appeared to be gathering personal information without your consent.
In short, there are many reasons why chatting with a chatbot could leave you feeling uneasy, depending on the particular encounter and the person involved.
Still, it's worth remembering that chatbots are a young technology, and their capabilities and limitations are constantly evolving. As they mature, we may need to reevaluate our expectations of them, and how we interact with them, to ensure they remain ethical, secure, and useful tools for communication.