Bard, Google’s AI-powered chatbot, is currently a big joke, if Google employees are to be believed. They describe the tool as “worse than useless” and a “pathological liar”.
Is Google Bard useless?
Bloomberg interviewed 18 current and former Google employees who had interactions with Bard, and shared findings from internal documents related to the chatbot.
Before Bard’s public launch, Google asked its employees to provide feedback on the chatbot, but that feedback was largely ignored. It included reviews in which employees labeled Bard “unworthy” and a “pathological liar”. Another comment stated outright that Google shouldn’t launch Bard: “Bard is worse than useless: don’t launch it!”
One employee described an experience with Bard before launch: when prompted for instructions on how to land a plane, the AI repeatedly gave advice that would lead to a crash. In another experiment, an employee asked for scuba diving instructions and noted that the answers given “would likely result in serious injury or death”.
Another problem concerns ethics. Google teams working on AI ethics have been “disempowered and demoralized”, and were reportedly told “not to get in the way or try to kill the generative AI tools being developed”.
A Google spokesperson responded to the article, simply stating: “We continue to invest in the teams working to apply our AI principles to our technology”.
Google recently released Bard as an experimental version, first in the United States. The results so far are unimpressive, with many testers preferring ChatGPT or Microsoft Bing (which builds on the same technology as ChatGPT).