
It's hard to believe it's only been about a week since Microsoft debuted the ChatGPT-enhanced Bing.

A select group of testers were granted early access to play with the new Bing and Edge browser, now integrated with OpenAI's conversational AI technology. Since then, the internet has been flooded with conversations with the chatbot that range from professing its love to New York Times columnist Kevin Roose to adamantly claiming the year is 2022 and not backing down. For a list of Bing's meltdowns, we recommend Tim Marcin's roundup.

SEE ALSO: Microsoft's Bing AI chatbot has said a lot of weird things. Here's a list.

Naturally, when testers got their hands on the new Bing, they were determined to poke holes in its intelligence and map out its limitations. And boy, did they accomplish this. While that might not seem like a good look for Microsoft, it's all part of the plan. A critical aspect of developing a large language model is to give it as much exposure and experience as possible. This allows developers to incorporate new feedback and data, which will make the technology better over time, like a mythical being absorbing the strength of its vanquished enemies.

Microsoft didn't exactly put it in those words in its blog post on Wednesday. But it did reiterate that Bing's chaotic week of testing was totally supposed to go down that way. "The only way to improve a product like this, where the user experience is so much different than anything anyone has seen before, is to have people like you using the product and doing exactly what you all are doing," said the Bing blog.

But the bulk of the announcement was devoted to acknowledging Bing's wacky behavior this week and proposing solutions to address it. Here's what they came up with:


Improving searches that require timeliness and accuracy

Microsoft shared that Bing has generally been good at providing correct citations and references. But when it comes to checking live sports scores, providing facts and numbers concisely, or, ahem, stating the correct year we're currently living in, it needs some work. Bing is increasing the grounding data fourfold and is considering "adding a toggle that gives you more control on the precision vs creativity of the answer to tailor to your query."

Fine-tuning Bing's conversation skills

The chat feature is where a lot of the mayhem has occurred this week. According to Bing, this is largely due to two things:

1. Long chat sessions

Chat sessions that extend to 15 or more questions can confuse the model. It's unclear if this is what might trigger dark musings from its villainous alter-ego Sydney, but Bing says it will "add a tool so you can more easily refresh the context or start from scratch."

2. Mirroring the user's tone

This might explain why Bing chat has taken an aggressive tone when asked provocative questions. "The model at times tries to respond or reflect in the tone in which it is being asked to provide responses that can lead to a style we didn’t intend," said the post. Bing is looking into a solution that will give the user "more fine-tuned control."

Fixing bugs and adding new features

Bing says it's continuing to fix bugs and technical issues and is also thinking about adding new features based on user feedback. Those might include features such as booking flights or sending emails, and the ability to share great searches and answers.

Topics: Artificial Intelligence, ChatGPT
