Intelligent Proximity for Telepresence endpoints

TC 7.1.2 code was recently posted to CCO.  This version of code supports the experimental (or feature-preview) “Intelligent Proximity” feature.  This feature allows your Apple iPad or iPhone (iOS 7+) to pair with the endpoint (C-, EX-, SX-, or MX-series) using “ultrasound” (~21kHz), perform simple call control (dial/answer), and view content being shared in the call on your iPad/iPhone.

Administrator Configuration

  • Log in as an administrator to the telepresence codec on which you want to enable the feature.
  • Make sure it is running 7.1.2 code.
  • Under System Configuration, search for “byod”.
  • Set the Mode to On.

Note: the ultrasound volume is independent of the normal audio volume.  There are CLI configuration commands to change the volume of the ultrasound pairing signal.  The default is optimized for stand-alone room units (e.g. the MX-series).  I’ve had no problems making it work on a C40, SX20, Profile 55, MX300 G2, and MX200.  If your codec is connected to an amplified room audio system and you are seeing inconsistent pairing results, you may need to adjust this volume.
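If you need to find these settings from an SSH session to the codec, the TC CLI supports “//” pattern searches across the configuration tree.  The search below is real CLI syntax; the specific setting name shown afterward is only an illustrative assumption (the exact path and value range under the experimental BYOD group vary by release, so use the names the search returns on your unit):

```
# List the experimental BYOD/Proximity-related settings on this codec:
xConfiguration //byod

# Illustrative assumption only -- substitute the actual setting name
# returned by the search above for your software version:
xConfiguration Experimental BYOD Mode: On
```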

Ultrasound is used just for the pairing of the iOS device, not for content sharing or codec control.  Those functions are done over IP via the WiFi network.  Therefore, your iOS device will need to be on a WiFi network that has connectivity to the codec.

The ultrasound signal runs at about 21kHz, which is typically a high enough frequency to stay contained within a single room.  The signal carries an encoded security token that validates that you are currently in the room.  If the ultrasound signal is no longer sensed, the IP connection is dropped.  This prevents a user from viewing content on their iOS device without actually being present in the room.
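The presence check described above can be sketched as a small state machine: the IP session stays up only while a valid token keeps arriving over ultrasound.  This is an illustrative model, not the actual Proximity implementation; the token value, timeout, and method names are all assumptions:

```python
import time

PRESENCE_TIMEOUT_S = 5.0  # assumed grace period; the real value is not documented here


class ProximitySession:
    """Illustrative model: the IP session stays connected only while the
    in-room ultrasound token is still being heard and validates correctly."""

    def __init__(self, expected_token: str):
        self.expected_token = expected_token  # token encoded in the ~21kHz signal
        self.last_heard = None
        self.connected = False

    def on_ultrasound_token(self, token: str, now: float) -> None:
        # Only a token matching the codec's current token proves in-room presence.
        if token == self.expected_token:
            self.last_heard = now
            self.connected = True

    def tick(self, now: float) -> None:
        # Drop the IP connection once the ultrasound signal is no longer sensed.
        if self.connected and (now - self.last_heard) > PRESENCE_TIMEOUT_S:
            self.connected = False


session = ProximitySession(expected_token="room-token-123")
session.on_ultrasound_token("room-token-123", now=0.0)
session.tick(now=3.0)   # token heard recently -> session stays connected
session.tick(now=10.0)  # silence exceeded the timeout -> session dropped
```

The key design point is that IP reachability alone is never enough: without a recent, valid ultrasound token, the session is torn down.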


End-user configuration/usage

Search for Proximity on the App Store and install it (free application).

Join a WiFi network that has connectivity to the Telepresence Codec.

Enter the room containing the codec you want to pair with.  The Proximity app asks for permission to use the microphone.  This is required so that the iOS device can detect the ultrasound signal from the codec.  If you deny permission to use the microphone, the app will not work.

When the app hears the ultrasound, it connects to the codec via IP.  If the codec is not in a call, you will be allowed to dial or answer.  If the codec is in a call and content is being shared, you will see the content on your device.  Depending on the number of snapshots configured in the BYOD section, you can scroll back to see content you may have missed (if you came into the meeting late) or want to refer back to.
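The snapshot scroll-back behavior is essentially a fixed-size history of the most recent content frames: once the configured limit is reached, the oldest snapshot falls off.  A minimal sketch of that idea (the class, names, and default count here are assumptions for illustration; the real limit is whatever the administrator sets in the BYOD section):

```python
from collections import deque

MAX_SNAPSHOTS = 10  # assumed default; set by the administrator in the BYOD configuration


class SnapshotHistory:
    """Keeps the N most recent content snapshots so a late joiner can scroll back."""

    def __init__(self, max_snapshots: int = MAX_SNAPSHOTS):
        # deque with maxlen evicts the oldest entry automatically when full.
        self._frames = deque(maxlen=max_snapshots)

    def add(self, frame: bytes) -> None:
        self._frames.append(frame)

    def scroll_back(self) -> list:
        # Newest first -- the order a user would flick back through on the iPad.
        return list(reversed(self._frames))


history = SnapshotHistory(max_snapshots=3)
for i in range(5):
    history.add(f"slide-{i}".encode())
# Only the 3 most recent slides remain, newest first.
```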

It’s very cool!