I-Zone Review at Display Week 2016

Here is a run-down of three more I-Zone technologies shown at Display Week 2016 that we wanted to bring to your attention.

Synaptics was in the I-Zone with its latest automotive concept prototype, called Torch. The group described it as an automotive human-machine interface (HMI) concept, and it came in the form of a steering-wheel application that provides haptic force feedback using the company's "ClearForce" (trade name) technology for driving applications. According to Synaptics, the haptic feedback helps avoid accidental activation, validates gestures, and improves "no-look" operation.

Along with this, Synaptics also showed its Natural ID fingerprint sensor technology, which can be adapted for the automotive market. The company calls this FPS (short for fingerprint sensor), and it is a biometric-authentication user-interface navigation concept. What was shown was a device with a built-in "D-Pad" offering navigation capabilities. Applications include in-car payments (with a valet mode) and user-specific personalization features. Synaptics claims this helps reduce driver distraction by enabling the driver to keep both hands on the steering wheel (presumably while shopping and driving?).

We touched on the technology in a February company update story (Synaptics Developing Integrated Touch & Drivers), so it was interesting to see it come out of the lab and onto the Display Week I-Zone show floor.

Another interesting prototype in the I-Zone came from Lumii, an MIT spin-off that we covered back in November 2015 when it won a State of Massachusetts seed-money grant. See that coverage by Matt Brenesholtz here: Lumii Wins Grant for Light Field Technology.

In the I-Zone, the group showed its light field engine, which accepts 3D models and computes a set of unique patterns that can be printed, stacked, and then illuminated by a backlight. According to the group, the result is a multi-view light field that can be of any size or dimension. The technology offers high resolution and large sizes, is bright, provides full parallax, and is easy to produce.
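The basic idea behind stacked printed patterns producing a view-dependent (multi-view) image can be sketched with simple ray geometry: a backlight ray passes through a different combination of pixels in each layer depending on its angle, so different viewing directions see different intensities. The sketch below is an illustrative toy model only, not Lumii's actual algorithm; the function name, layer spacing, and pixel values are all assumptions for demonstration.

```python
import math

def transmitted_intensity(layers, x, theta_deg, gap, pitch):
    """Intensity of a backlight ray leaving a stack of printed
    attenuation layers.

    layers    -- list of 1-D transmittance arrays, back layer first
    x         -- lateral position where the ray enters the back layer
    theta_deg -- ray angle from the stack normal, in degrees
    gap       -- spacing between adjacent layers
    pitch     -- pixel pitch within each layer

    Toy model: the transmitted intensity is the product of the
    transmittances of the pixels the ray crosses in each layer.
    """
    t = 1.0
    for k, layer in enumerate(layers):
        # Lateral position where the ray crosses layer k
        xk = x + k * gap * math.tan(math.radians(theta_deg))
        idx = int(round(xk / pitch))
        if not (0 <= idx < len(layer)):
            return 0.0  # ray exits the side of the stack
        t *= layer[idx]
    return t

# Two tiny layers: viewing the same entry point from different
# angles samples different pixel pairs, giving different intensities.
stack = [[1.0, 0.5], [0.25, 1.0]]
head_on = transmitted_intensity(stack, 0.0, 0.0, gap=1.0, pitch=1.0)
oblique = transmitted_intensity(stack, 0.0, 45.0, gap=1.0, pitch=1.0)
```

Here `head_on` samples pixel 0 of both layers while `oblique` samples pixel 0 of the back layer and pixel 1 of the front layer, so the two views differ; computing layer patterns so that every view angle reproduces the desired image is the optimization Lumii's engine performs.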

As Matt indicated, there is a YouTube video online with more details.

Maradin showed a laser imaging system based on laser scanning and detection. The components included an avalanche photodiode (APD) sensor that receives light reflected from the object, which is flooded with IR laser light by a MAR1100 2D MEMS scanning-mirror device. Image data from the sensor is sent to an FPGA for processing and display on a desktop monitor.

Maradin offers a full laser-scanning solution based on a 2D MEMS mirror device combined with RGB laser diodes and optics.

Features of the device include:

- A 2D single-mirror design that, the company said, enables a small optical engine and increases optical efficiency.
- Combined electrostatic (horizontal axis) and electromagnetic (vertical axis) actuation, which improves system robustness and performance.
- A rather wide optical field of view (FOV) of 45° (H) x 30° (V), delivering a large image/scan area from this small device.
- A MEMS controller with a standard digital video interface to help simplify system design and integration.
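To put the quoted field of view in context, a quick geometric sketch shows the scan area such an FOV covers at a given throw distance (assuming a flat surface perpendicular to the optical axis; the function and distance are illustrative, only the 45° x 30° figures come from Maradin):

```python
import math

def scan_area(fov_h_deg, fov_v_deg, distance):
    """Width and height of the scanned area at a given throw
    distance, for a scanner with the given optical field of view.
    Simple flat-surface geometry: size = 2 * d * tan(FOV / 2)."""
    width = 2 * distance * math.tan(math.radians(fov_h_deg / 2))
    height = 2 * distance * math.tan(math.radians(fov_v_deg / 2))
    return width, height

# At a 1 m throw, a 45° x 30° FOV covers roughly 0.83 m x 0.54 m.
w, h = scan_area(45, 30, 1.0)
```

So even a small module with this FOV scans an area comparable to a large desktop display at a throw of about a metre, which is what makes the "large image/scan area from this small device" claim plausible.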

Steve Sechrist

Analyst Comment

We have been reporting on Maradin since 2010. In checking this, I found a forecast from In-Stat that "Approximately 90% of mobile handsets will incorporate embedded pico projectors by 2014" – I bet that forecast found its way into a few pitches to VCs, who may have been somewhat disappointed! (BR)