It’s a time, but it’s an unspecified one. It very specifically stated “say what time you want your reminder” after a location was given, indicating that setting it by location wasn’t an option. The user wasn’t paying enough attention.
I think what’s key here is that you used to be able to do this. I used to use Google Assistant regularly, and I feel like I’ve discovered dropped features through frustrated exchanges like this. It’s easy to miss that it specifically asked for a time when you’re in autopilot mode and expecting that, if there’s an error, it just misheard you.
I see what you mean; on like the fourth try the assistant explicitly said “just give me the time”.
I guess it’s philosophical whether you say the user was “wrong”. Continuing to ask wasn’t going to get them any closer to the reminder being set, so I guess you could call that “wrong” if you want to say they should be savvy enough to know that. They might have even known that but still carried on, I do that sometimes just because.
IMO this software should be pushed to adapt to natural language if they want to keep pretending it’s s m a r t. If you were asking this of a person (who was somehow always with you…) and they said “just tell me the time”, you’d say “I don’t know what time it will be, just remind me whenever we get there!”
This is what “the customer is always right” is supposed to mean. It doesn’t mean the customer can demand anything; it means you can argue as much as you want, but in the end the customer wants what the customer wants, and you can either try to please them or leave them unsatisfied.
looks like a lot of people here should never open a business as they’d stand there arguing with customers that what they want is WRONG and actually they should be wanting THIS OTHER THING instead…
The main reason I think of the user as “wrong” in this case is because they got angry at the end, fair or not!
They got angry because it forgot what the reminder was.
It asked for information A, then it asked for B repeatedly, and once that was finally settled (with an answer that is technically supportable but absolutely not what the user wanted) it then asked for A again.
If this were a real person, I would definitely be thinking “should I just ask someone else?”