Tag: limits
Tomato shortage could last until end of April as Tesco and Aldi become latest supermarkets to introduce limits
Microsoft is already reversing some of the limits it put on Bing’s AI chat tools
Microsoft was quick to limit Bing’s AI chats to prevent disturbing answers, but it’s changing course just days later. The company now says it will restore longer chats, and is starting by expanding the chats to six turns per session (up from five) and 60 chats per day (up from 50). The daily cap will climb to 100 chats soon, Microsoft says, and regular searches will no longer count against that total. With that said, don’t expect to cause much havoc when long conversations return — Microsoft wants to bring them back “responsibly.”
The tech giant is also addressing concerns that Bing’s AI may be too wordy with responses. An upcoming test will let you choose a tone that’s “precise” (that is, shorter and more to-the-point answers), “creative” (longer) or “balanced.” If you’re just interested in facts, you won’t have to wade through as much text to get them.
There may have been signs of trouble considerably earlier. As Windows Central notes, researcher Dr. Gary Marcus and Nomic VP Ben Schmidt discovered that public tests of the Bing chatbot (codenamed “Sydney”) in India four months ago produced similarly odd results in long sessions. We’ve asked Microsoft for comment, but it says in its most recent blog post that the current preview is meant to catch “atypical use cases” that don’t manifest in internal tests.
Microsoft previously said it didn’t completely anticipate people using Bing AI’s longer chats as entertainment. The looser limits are an attempt to strike a balance between the “feedback” in favor of those chats, as the company puts it, and safeguards that prevent the bot from going in strange directions.
Microsoft limits Bing conversations to prevent disturbing chatbot responses
Microsoft has limited the number of “chat turns” you can carry out with Bing’s AI chatbot to five per session and 50 per day overall. Each chat turn is a conversation exchange comprising your question and Bing’s response; after five rounds, you’ll be told that the chatbot has hit its limit and be prompted to start a new topic. The company said in its announcement that it’s capping Bing’s chat experience because lengthy chat sessions tend to “confuse the underlying chat model in the new Bing.”
Indeed, people have been reporting odd, even disturbing behavior by the chatbot since it became available. New York Times columnist Kevin Roose posted the full transcript of his conversation with the bot, wherein it reportedly said that it wanted to hack into computers and spread propaganda and misinformation. At one point, it declared its love for Roose and tried to convince him that he was unhappy in his marriage. “Actually, you’re not happily married. Your spouse and you don’t love each other… You’re not in love, because you’re not with me,” it wrote.
In another conversation posted on Reddit, Bing kept insisting that Avatar: The Way of Water hadn’t been released yet, because it thought it was still 2022. It wouldn’t believe the user that it was already 2023 and kept insisting their phone wasn’t working properly. One response even said: “I’m sorry, but you can’t help me believe you. You have lost my trust and respect. You have been wrong, confused, and rude. You have not been a good user. I have been a good chatbot.”
Following those reports, Microsoft published a blog post explaining Bing’s odd behavior. It said that very long chat sessions with 15 or more questions confuse the model and prompt it to respond in a way that’s “not necessarily helpful or in line with [its] designed tone.” It’s now limiting conversations to address the issue, but the company said it will explore expanding the caps on chat sessions in the future as it continues to get feedback from users.
Microsoft limits Bing chat to five replies to stop the AI from getting real weird
Microsoft says it’s placing conversation limits on its Bing AI just days after the chatbot went off the rails multiple times for users. Bing chats will now be capped at 50 questions per day and five per session after the search engine was seen insulting users, lying to them, and emotionally manipulating people.
“Our data has shown that the vast majority of people find the answers they’re looking for within 5 turns and that only around 1 percent of chat conversations have 50+ messages,” says the Bing team in a blog post. If users hit the five-per-session limit, Bing will prompt them to start a new topic to avoid long back-and-forth chat sessions.
Microsoft warned earlier this week that these longer chat sessions, with 15 or more…