My 2.5-year-old daughter has developed an interesting new behavior: The beeping of the coffee pot, the chirp of the microwave, the bloop-bloop of the TiVo all cause her to ask, “What does that noise mean?” Rather than focus on the source of the sound (“What was that noise?”) she is trying to understand and parse the information carried by the sound: that three loud, long beeps mean the coffee is ready, that two short beeps and one long one mean the microwave is done, that the whoop-whoop-whoop means someone just bumped into an alarmed car.
Increasingly, wordless sounds are carrying information we can interpret. James Poniewozik, TV critic for Time Magazine, recently tweeted: “Occupational hazard: need to change all my iPhone alert tones to sounds that are never used in TV episodes.” In a home whose various members are all part of the same technology ecosystem, the new-message tone often has two people scrambling for their phones. And since users can assign different sounds and tones to specific functions or people, this ambient sound information can become dense and complicated.
Adding to this complication is a new startup called Chirp.io. Chirp is an app for the iPhone that encodes data—photos, contact lists, documents, web pages—as a two-second, multi-note tone that can be shared. Chirp works as a broadcast, rather than point-to-point, so anyone within listening distance can receive whatever is being shared. This broadcast aspect is touted as one of Chirp’s key features, since it cuts out the hassle of assigning a specific address: If a friend (or anyone) nearby is running Chirp, they will get your message. Soon the air may be filled with the sound of a million chirping phones, making my daughter’s question of “What does that noise mean?” a lot harder to answer.
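To get a feel for how data can ride on a sequence of notes, here is a minimal sketch of one way such an encoding could work. It maps each character of a short code to a musical pitch and fits the whole sequence into roughly two seconds. The alphabet, base pitch, and semitone spacing here are all assumptions for illustration, not Chirp’s actual protocol.

```python
import math

# Hypothetical tone-based encoding, loosely inspired by the idea of
# sending a short code as a two-second run of notes. Everything below
# (alphabet, base frequency, semitone steps) is an illustrative choice.

ALPHABET = "0123456789abcdefghijklmnopqrstuv"  # 32 symbols, 5 bits each

def note_frequency(symbol_index, base_freq=1760.0):
    """Frequency for a symbol: semitone steps above a base pitch."""
    return base_freq * (2 ** (symbol_index / 12.0))

def encode(code):
    """Turn a short code into a list of (frequency_hz, duration_s) tones."""
    duration = 2.0 / len(code)  # fit the whole message into ~2 seconds
    tones = []
    for ch in code:
        idx = ALPHABET.index(ch)
        tones.append((round(note_frequency(idx), 1), round(duration, 3)))
    return tones

def decode(tones, base_freq=1760.0):
    """Recover the code by matching each frequency to its nearest symbol."""
    code = []
    for freq, _ in tones:
        idx = round(12.0 * math.log2(freq / base_freq))
        code.append(ALPHABET[idx % len(ALPHABET)])
    return "".join(code)

tones = encode("he110")
print(decode(tones))  # round-trips back to the original code: he110
```

A real acoustic scheme would also need synthesis of the actual audio, pitch detection on the receiving microphone, and error correction to survive room noise; the sketch only shows the symbol-to-note mapping at its core.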