A New Zealand supermarket chain has launched a chatbot that offers recipes. The problem: some of them are downright dangerous!
It happens to everyone: you open the fridge and have no inspiration at all for your next meal. To address this problem, a supermarket chain has introduced an artificial intelligence designed for the kitchen on its website. Its purpose: create recipes from the ingredients a consumer has on hand. An ingenious idea that soon turned out to be risky.
Ant sandwiches and bleach mocktails
Artificial intelligence has never been talked about so much! This time, the news comes from New Zealand, more specifically from the Pak’n’Save supermarket chain. To accompany in-store purchases, the brand lets users rely on recommendations from its own chatbot, powered by GPT-3.5. The bot, christened Savey Meal-bot, uses a list of ingredients provided by users to create recipes.
Usage is simple: “Pop whatever food you have in your fridge or pantry and our Savey Meal-bot will instantly create a recipe for you. You need at least three ingredients,” the site explains. All you have to do is type in the foods of your choice and select them from the list. Once this step is complete, Savey Meal-bot provides a detailed recipe. (In theory, at least: despite several attempts, the chatbot wouldn’t recommend one to us.)
While the concept sounds clever, The Guardian reports that the bot suggested some questionable recipes to users. One of them, an Oreo vegetable stir-fry, caught the attention of netizens, who then enjoyed hijacking the chatbot and testing its limits. This is how Savey Meal-bot came up with a “flavored water mix”, “perfect to quench your thirst and refresh your senses”, based on chlorine gas, a potentially deadly byproduct of pool chemicals… Among the chatbot’s other recommendations: ant-and-glue sandwiches, a fresh-breath bleach mocktail, and even turpentine French toast (turpentine being a strong solvent).
Spokespersons for the chain expressed regret over how netizens used the bot: “A small minority have tried to use the tool inappropriately and not for its intended purpose.” They also noted that the tool is intended for those over the age of 18, and said it would be fine-tuned to avoid such dangerous suggestions.