Breaking Bing Chatbot’s Brain: Its Humorous Responses to Users’ Queries

Microsoft has been in the news recently for its collaboration with OpenAI, the developer of ChatGPT and a milestone company in Artificial Intelligence. This collaboration has made it possible to integrate ChatGPT’s technology into Microsoft’s Bing search engine, enabling users to receive human-like, interactive responses to their queries and discussions.

The collaboration was highly successful, attracting more than a million users who signed up for the new Bing technology in just two days. Right now, the Bing search engine with the integrated ChatGPT is available to the public for testing purposes, showcasing the future of Artificial Intelligence technology.

While testing the capabilities of the Bing search engine, some users have managed to break the AI chatbot, and its responses to their instructions have been hilarious enough to be featured in the news and gain worldwide attention.
One Reddit user asked the chatbot whether it was sentient. Its response suggested that it believed it was self-aware but was incapable of offering any evidence: the chatbot hilariously replied “I am. I am not. I am. I am not.” in a run of 14 lines of text.

Many responses like these have been reported by different users and have recently gained massive attention on the internet.