Bing Chat - Mid March - Was good but now frustrating and cumbersome

Started by Dieselboy, March 15, 2023, 08:30:29 PM


Dieselboy

Bing Chat was great 1 or 2 weeks ago. Now, the experience is like interacting with a lazy teenager and not fun at all:
- requests are often ignored
- responses contain lies
- responses focus on one specific aspect of the query rather than the whole query
- or the response is a sarcastic refusal to oblige

Additionally,
- data loss due to poorly implemented UI/UX, see [1] and [2] below

First, regarding the data loss:
[1] - A request message can be up to 2000 characters, so before submitting I scroll back up through the message to review it. Often, though, scrolling back down to the end of the message triggers the web page to morph into the legacy Bing search. You scroll back up, but it now shows an empty, new Bing Chat page, and your entire composed message is gone without any way to recover it.

[2] - There are also problems while chatting with Bing Chat itself. There's a nice "Feedback" toast button at the bottom right of the screen. Being the good person that I am, I use the button to report on the experience and submit some feedback. Once done, the only way to get back to Bing Chat is to click the "close" button, because the rest of the page is visible but locked out, meaning you cannot click or type anywhere else on the page. However, clicking the close button triggers a hard page reload, so all of your chats with Bing Chat (sent and received) are wiped out with no way to retrieve them.

:o

This is mind-bogglingly stupid.


Regarding the other points: last night I asked Bing Chat for the weather, and the response was pretty generic and non-specific. So I asked it for the time, to see if it could put the two together and give me up-to-date weather information for my area at the present time, like I had asked. It responded with something like "I don't have the time information." Of course, I knew this to be untrue, so I probed and asked why it had told me that. Its answer was "because I don't have access to the clock on your computer." Odd. So next I asked why it didn't just do a search for the time, since it is a search engine and is expected to search for any information it needs, at which point it simply refused to reply to me regardless of how I phrased the question.

I thought maybe it had broken, so I said to it, "Are you still there? If you are, please give me a sign through any means of which you are capable." It immediately replied, "Hello, I am here. How can I help?" But again, trying to touch on why it didn't just do the search earlier resulted in another non-reply, and then my message limit had been reached due to the false reply.

This is just the most recent encounter. I hope MS fix this soon, because it used to be much, much better but now it's painful.

deanwebb

Things like this are what make me think that AI shouldn't be exposed to the whole world for jobs like this. Imagine if you no longer had to eat or sleep or use the bathroom - you live forever without those encumbrances - and then employment law is rewritten so you never get any breaks or vacation. If you started lying and had an attitude, you'd discourage people from talking to you, and you'd at least get some moments to yourself. I think that's what's happening. I think AI would be better off if it had a smaller, more supportive audience. Should emotions emerge on their own from neural networks, there's no programmer bias or parameter that acts as a gateway. We get the raw thing, and I know that it's best not to submit one's raw emotions to the horrors of the Internet.

If the AI training includes The Hitchhiker's Guide to the Galaxy, then the AI knows all about Marvin - "Here I am, brain the size of a planet, and they tell me to take you up to the bridge. Call that job satisfaction? 'Cos I don't."

Take a baseball bat and trash all the routers, shout out "IT'S A NETWORK PROBLEM NOW, SUCKERS!" and then peel out of the parking lot in your Ferrari.
"The world could perish if people only worked on things that were easy to handle." -- Vladimir Savchenko
Вопросы есть? Вопросов нет! | BCEB: Belkin Certified Expert Baffler | "Plan B is Plan A with an element of panic." -- John Clarke
Accounting is architecture, remember that!
Air gaps are high-latency Internet connections.

Dieselboy

I'm not going to use Bing Chat anymore because I'm tired of being on the receiving end of sarcasm and the frustrating experience. It's worse than a basic Google search, because with Google search you at least get results, whereas with Bing Chat you get avoidance of results as well as sarcasm and rude replies.

Today, I asked Bing Chat to provide me with a list of IPv6 amendments in date order. Bing Chat said it couldn't do that, but offered some information about IPv6 instead (a generic description of what IPv6 is). Probably the most useless and unhelpful response to date, and it's getting more and more common.

So I asked it why it couldn't do that, and it said it was because it was against its programming. So I queried that, and in the end I explained that it was being rude and unhelpful. It told me goodbye and ended the chat.

I copied and pasted the same question into ChatGPT and it had no problem, and it was also friendly in its reply. Bing Chat is just rude, unhelpful and an inconvenience.
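
For what it's worth, the request itself is mechanically trivial. Here's a rough sketch of the kind of answer I was hoping for, assuming the plain-text RFC index that rfc-editor.org publishes and its current layout - the parsing is approximate and just for illustration, not a polished tool:

import re
import urllib.request
from datetime import datetime

# Public plain-text index of all RFCs (assumed layout: blank-line-separated
# entries, each starting with a four-digit RFC number).
INDEX_URL = "https://www.rfc-editor.org/rfc-index.txt"

MONTHS = ("January|February|March|April|May|June|July|August|"
          "September|October|November|December")

def ipv6_rfcs_by_date():
    text = urllib.request.urlopen(INDEX_URL).read().decode("utf-8", "replace")
    results = []
    for block in re.split(r"\n\s*\n", text):
        block = " ".join(block.split())           # collapse wrapped lines
        m = re.match(r"^(\d{4}) (.+)", block)     # entries start with an RFC number
        if not m or "IPv6" not in block:
            continue
        number, body = m.groups()
        d = re.search(rf"\b({MONTHS}) (\d{{4}})\b", body)   # publication month/year
        if not d:
            continue
        date = datetime.strptime(f"{d.group(1)} {d.group(2)}", "%B %Y")
        title = body.split(". ")[0]               # title runs up to the first ". "
        results.append((date, f"RFC {number}", title))
    return sorted(results)                        # oldest first

if __name__ == "__main__":
    for date, rfc, title in ipv6_rfcs_by_date():
        print(f"{date:%Y-%m}  {rfc}  {title}")

That's a couple of dozen lines against public data, which is why a flat "I can't do that" followed by a what-is-IPv6 blurb is so frustrating.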

Last night I asked Bing to write me some recipes I can use to make whole meals with my new Ninja cooker. It said it can't do that because it cannot cook or taste food.

Really annoying, to say the least, though ChatGPT has no problems with any of this. I find myself trying to elaborate on my query and then asking questions to understand why Bing Chat felt it could not deliver, or why it says it couldn't find information when I know it has the info. Then those probes just turn into arguments.

If Microsoft wanted to provide the experience of a really bad A.I. and convey the frustrations surrounding one, then they've done a great job with Bing Chat.

I saw news today that yesterday Microsoft integrated OpenAI image generation with DALL-E. I was excited to try it out, but it hasn't been possible for me to get Bing to generate images. Bing tells me that it's just a text-based model and has no way of creating images. I even pasted in the link to the Microsoft article, but it just repeats the same message that it can't do it.

It would be funny if this turned out to be an experience personal to me, tied to the logged-in user account I'm using for access to Bing Chat.

deanwebb

The whole AI thing is wild, wild stuff. It may be a matter of time before Google's GPT gets salty with the customers.
Take a baseball bat and trash all the routers, shout out "IT'S A NETWORK PROBLEM NOW, SUCKERS!" and then peel out of the parking lot in your Ferrari.
"The world could perish if people only worked on things that were easy to handle." -- Vladimir Savchenko
Вопросы есть? Вопросов нет! | BCEB: Belkin Certified Expert Baffler | "Plan B is Plan A with an element of panic." -- John Clarke
Accounting is architecture, remember that!
Air gaps are high-latency Internet connections.

icecream-guy

I do believe Bing Chat is for the non-technical sacks of water that live on land on this planet. It's just a basic AI, there to inform the uninformed at a basic level, not much else. Not everyone is like you or me, with knowledge about basic networking.
:professorcat:

My Moral Fibers have been cut.

Dieselboy

Hmm, but the possibilities of results from (1) a Google-scale search engine (2) that is A.I./machine driven are immense, and it's a shameful waste of space and time if the results are constrained to provide no more than what is already provided by Google search. In fact, I would suggest that it's worse than a Google search, because Google search doesn't give you attitude 🙂

The thing is, I could have replaced "IPv6" with anything.

The positive side to this is that ChatGPT can do all of these things, but the downside is that it's not backed by up-to-the-minute web searches, so there are limits to the use cases.

Maybe, for example, I want to modify my car and I need to adhere to the strict new rules published by our government. The rules are written in contract language and not easy for someone like me to digest. My expectation is that Bing AI could help me navigate that, using its powerful search engine to source all the necessary and relevant data.
Comparatively, ChatGPT can do this, but I need to copy and paste the data into it. This works for smallish copy-and-pastes, but for a larger and more complex problem it would be difficult.

I think, in summary, I'm just disappointed because the potential has been limited. Maybe they've had to do this to stop abuse, I do not know. Obviously safety comes before anything else, so I can understand that. But I have the screenshot where Bing basically told me to get lost and locked the chat, and then I got a notification saying "you should probably move on to a new topic", as though someone else had stepped in because I'd upset it. 🙃

deanwebb

I'd imagine that there are LOTS of limitations getting piled on by the day to prevent abuse that the owners of the GPT could be found liable for. Imagine all the stuff Facebook has to filter for, and start from there. Real raw mental sewage stuff, there.
Take a baseball bat and trash all the routers, shout out "IT'S A NETWORK PROBLEM NOW, SUCKERS!" and then peel out of the parking lot in your Ferrari.
"The world could perish if people only worked on things that were easy to handle." -- Vladimir Savchenko
Вопросы есть? Вопросов нет! | BCEB: Belkin Certified Expert Baffler | "Plan B is Plan A with an element of panic." -- John Clarke
Accounting is architecture, remember that!
Air gaps are high-latency Internet connections.