14. Tablets: Changing the Definition of Mobile Devices
Fewer hardware limitations
Larger screen displays more participants
Greater CPU power
Adoption in verticals seeking video conferencing
Telemedicine projected to be a $3.6 billion annual market within the next 5 years (Pike & Fischer)
Major smartphone players entering the market: Apple, Google, Microsoft
MID = Mobile Internet Device (larger than a smartphone, smaller than a netbook). Access to great applications is a huge driver in the adoption of smartphones (see iPhone and Android in particular). 3G is still not reliable everywhere, but rapid improvements are happening. WiFi access improves coverage, for example in people's homes, and offers great access rates. It is getting easier to port PC software to mobile devices thanks to increased CPU speeds and development environments that are very similar.
Obviously there are a number of challenges associated with delivering a great video experience on a mobile device:
CPU limitations [improving quickly; many devices now have 1 GHz CPUs]
Limited and varying bandwidth [3G networks are improving, WiFi can be used as a complement, and good software adapts rapidly to what is available]
Packet loss and jitter on the network [well-designed media processing handles this with intelligent jitter buffers, packet loss concealment, and signaling]
Camera access and placement (front facing?) [many devices lack a front-facing camera, and on the iPhone there was no application access to the camera until iPhone OS 4.0, which was just announced]
Noisy environments [noise suppression can limit the impact]
Small screen [when that is a serious issue, the slightly bigger mobile devices offer much improvement]
H.264 SVC (which we talked about at eComm Europe) is extremely helpful in getting the best possible quality experience on the mobile device as well as for the other participants in a video call: each participant can send and receive the best quality their device and network support. The technical solutions are here to offer a great experience, but there are still concerns about whether people really want this [see the bullets on the slide, plus the fact that it is hard to keep the camera still, which is annoying to the person on the other side of the call].
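To make the jitter-buffer and packet-loss-concealment point above concrete, here is a minimal sketch in Java. It is not code from any real product, and the class and method names are invented for illustration: packets that arrive out of order are held and released in sequence-number order, and a lost packet is concealed by replaying the previous frame.

```java
import java.util.TreeMap;

// Minimal sketch of a jitter buffer with trivial packet loss
// concealment: out-of-order packets are reordered by sequence number,
// and a missing packet is concealed by repeating the last frame.
// All names here are invented for illustration.
public class JitterBufferSketch {
    private final TreeMap<Integer, byte[]> pending = new TreeMap<>();
    private int nextSeq = 0;
    private byte[] lastFrame = new byte[0];

    // Called from the network thread when an RTP-like packet arrives.
    public void push(int seq, byte[] payload) {
        if (seq >= nextSeq) {            // drop packets that arrive too late
            pending.put(seq, payload);
        }
    }

    // Called by the playout thread once per frame interval.
    public byte[] pop() {
        byte[] frame = pending.remove(nextSeq);
        if (frame == null) {
            frame = lastFrame;           // concealment: repeat previous frame
        }
        nextSeq++;
        lastFrame = frame;
        return frame;
    }

    public static void main(String[] args) {
        JitterBufferSketch jb = new JitterBufferSketch();
        jb.push(0, new byte[]{10});
        jb.push(2, new byte[]{30});      // packet 1 was lost (or is late)
        System.out.println(jb.pop()[0]); // 10
        System.out.println(jb.pop()[0]); // 10 again (concealed loss)
        System.out.println(jb.pop()[0]); // 30
    }
}
```

A real implementation would also adapt the buffer depth to measured jitter and use smarter concealment than frame repetition, but the reordering-plus-concealment structure is the core idea.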
Video conferencing and collaboration on this type of device is significantly closer to the desktop experience. It is not for all scenarios (these devices are not likely to be carried everywhere the way a phone is), but it is a great complement.
The lack of a front-facing camera is a huge issue, but it seems to be getting resolved this year. The iPhone is interesting: the small number of models makes it easy for developers to develop and test on few devices, but of course the limited range of devices is constraining from other perspectives (e.g. camera support, and all the limitations Apple imposes on the OS and devices). There are very few restrictions on how an Android device can look.
The development environments are pretty good for both platforms. The biggest issues on the iPhone side are the approval process for applications and the limitations on how software can be distributed: embedded software can only be sent to a known developer license (complicated, but good in the sense that it limits distribution), and you are not allowed to share much information about the Apple APIs and your solutions. The Android application environment is Java based, and only through the NDK (Native Development Kit) can a developer build performance-critical portions of an app in native code. Unfortunately, the debugger cannot step into the native code, which makes it hard for the developer to find bugs. The lack of documentation, and of some APIs that would make it easier to implement real-time media processing on these devices, means that we have to be quite innovative and try a lot of different tricks to find a good working solution, something we have been able to do.
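The Java/NDK split described above typically looks like the sketch below: the performance-critical media routine lives in a native library built with the NDK, and the Java side falls back to a portable implementation when that library is unavailable (for instance on the desktop, or in unit tests). This is a hedged illustration only; the library name "mediacore" and all class and method names are invented, and the "processing" shown is a placeholder gain reduction rather than real media code.

```java
// Sketch of the common Java/NDK split: a native fast path with a
// pure-Java fallback. The library name "mediacore" and all method
// names are invented for illustration.
public class EchoCanceller {
    private static final boolean NATIVE_AVAILABLE = tryLoadNative();

    private static boolean tryLoadNative() {
        try {
            System.loadLibrary("mediacore"); // hypothetical NDK-built library
            return true;
        } catch (UnsatisfiedLinkError e) {
            return false;                    // library absent: use Java path
        }
    }

    // Would be implemented in C via JNI when libmediacore is present.
    private static native void processNative(short[] frame);

    // Slow but portable reference path (placeholder: halve each sample).
    private static void processJava(short[] frame) {
        for (int i = 0; i < frame.length; i++) {
            frame[i] = (short) (frame[i] / 2);
        }
    }

    public static void process(short[] frame) {
        if (NATIVE_AVAILABLE) {
            processNative(frame);
        } else {
            processJava(frame);
        }
    }

    public static void main(String[] args) {
        short[] frame = {100, -100, 4000};
        process(frame);                      // Java fallback runs here
        System.out.println(frame[0] + " " + frame[1] + " " + frame[2]);
    }
}
```

Keeping a Java fallback also works around the debugging limitation mentioned above: the algorithm can be debugged in the Java path, with the native path used only for performance.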