Apple just upgraded its 800 number to use more advanced technology. If you call 1-800-MY-APPLE (1-800-692-7753) from Illinois (and a few other states) your call will be answered by the new service.
Instead of the old "press one for..." menus, the new service accepts any spoken request and routes your call. You can ask "Where's the closest store to zip code 60645?", ask for help with a drive problem, or ask to purchase a Mac computer. The application doesn't actually do troubleshooting or let you order anything — it either gets you the address and number of a local store (and/or connects you to that store) or it routes you to the proper human agent. The accuracy is very good, considering the complexity of the requests.
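The kind of high-level routing described above — classify a spoken request, then hand the caller to the right destination — can be sketched as a toy intent router. This is purely illustrative and assumes nothing about Apple's actual implementation; the intent names, keyword patterns, and `route` function are all hypothetical, and a real system would use statistical speech recognition and natural-language understanding rather than keyword matching.

```python
# Illustrative sketch (not Apple's implementation): a toy intent router
# that maps a transcribed caller request to a destination queue.
import re

# Hypothetical intent patterns; a production system would classify
# intents statistically, not with keyword lists like these.
INTENTS = [
    (re.compile(r"\b(store|closest|near|zip)\b", re.I), "store-locator"),
    (re.compile(r"\b(problem|help|broken|drive)\b", re.I), "tech-support"),
    (re.compile(r"\b(buy|purchase|order|price)\b", re.I), "sales"),
]

def route(transcript: str) -> str:
    """Return the queue for the first matching intent, else a human agent."""
    for pattern, queue in INTENTS:
        if pattern.search(transcript):
            return queue
    return "human-agent"  # when in doubt, fall back to a person

print(route("Where's the closest store to zip code 60645?"))  # store-locator
print(route("I need help with a drive problem"))              # tech-support
print(route("I'd like to purchase a Mac"))                    # sales
```

The fallback to a human agent mirrors the sensible design choice the column implies: when the system can't confidently classify a request, routing to a person beats guessing.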
In fact, in my opinion it's a little too accurate to be a purely automated system. I suspect it uses a mix of human assistance and speech recognition. Of course, when it comes to customer service, I really don't mind if the application is better than expected.
Naturally, I fiddled with the system a little to see how well it works. As you can imagine, I did find a few errors and gaps.
Can you break the system too? Sure, if you want, but what's the point of breaking a system if you don't learn anything? We in the speech technology business have enough real problems without worrying about contrived ones.

Overall, however, the system worked as well as can be expected for its first few days in service. Any new speech technology system (even one that uses humans for part of the recognition, assuming it does) requires tuning and tweaks; but so far, so good. It'll be interesting to see how far Apple pushes this high-level speech technology, and I'd be interested to hear how it would work for automating common help calls.
When I did a trial of one of my first speech recognition systems, a tester wrote to say that the system didn't recognize his Southern accent. I wrote back and asked if he really had a Southern accent — I already knew enough about non-expert testers to inquire — and he didn't. He faked an accent and (of course) the speech technology failed.
Experts generate useful failures that provide information about the system, either to the expert or to the people maintaining it. Trust me, Apple will have a sufficient number of real errors and doesn't need your help generating contrived ones.