Designing Better Hot Cakes

Let’s face it: the smartphone market is in the doldrums right now. Sales are tailing off as consumers hang onto older models, unconvinced they should spend hundreds of dollars on the latest gizmo just because it has multiple cameras or can fold up and slip into a jacket pocket. With flagship Apple and Samsung models priced near or even at $1,000, many buyers have hit their breaking point.

I’m certainly representative of this trend: I’m still using a Samsung Galaxy V that I bought five years ago this month. The battery still holds a charge, the camera is so-so, and the phone still does what I need it to do, although my constant battles with Swype could finally push me to upgrade. (That’s probably why I’m being bombarded with text messages from my carrier offering steep discounts on new Samsung and Google Pixel models.)

Even so, smartphone developers continue to plug on. Samsung Display is developing an OLED panel with a laser-drilled hole to accommodate a front-facing camera, according to the Korean Web site The. The manufacturing process is known as HIAA (Hole In Active Area), and Samsung Display recently installed HIAA equipment at its A4 facility in Chungnam, aiming to begin mass production of these panels in 2020, with the technology incorporated in Galaxy Fold and G11 models.

According to The, the A4 fab in Chungnam has two Gen 6 (1500mm × 1850mm) OLED production lines, with a production capacity of 30,000 substrate sheets per month. Instead of a separate hole for the camera, as used on the Galaxy S10 and Galaxy Note 10, a transparent display will be employed.

There are a couple of issues the company needs to resolve. One is the efficiency of light transmission through the transparent display to the camera: transparent OLEDs currently transmit at most about 40% of incoming light, which may be insufficient for the current generation of camera sensors. The other is a yellowish color cast on captured images caused by the polyimide layer in flexible OLED displays, something that would need to be corrected in the camera’s software.
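As a rough illustration of the kind of software correction involved (not Samsung’s actual pipeline — the function, values, and approach below are hypothetical), a uniform yellow tint can be neutralized with a simple gray-world white balance, which scales each color channel so its average matches the overall gray level:

```python
import numpy as np

def correct_color_cast(image):
    """Neutralize a uniform color cast (e.g. a yellowish tint from a
    polyimide layer) with a gray-world white balance: scale each
    channel so its mean matches the overall gray mean."""
    img = image.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)   # mean R, G, B
    gray = channel_means.mean()                       # target gray level
    gains = gray / channel_means                      # per-channel gain
    return np.clip(img * gains, 0, 255).astype(np.uint8)

# A flat patch with a yellow cast (R and G boosted relative to B).
patch = np.full((4, 4, 3), [140, 140, 100], dtype=np.uint8)
balanced = correct_color_cast(patch)  # all channels pulled to ~126
```

A production camera pipeline would do something far more sophisticated (per-region correction, calibrated against the panel’s measured transmission spectrum), but the principle is the same: measure the cast, apply compensating per-channel gains.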

This R&D effort is just one of many at Samsung, which saw its most recent quarterly profits fall 56% year over year, and which just announced a major $11 billion investment in QD-OLED display manufacturing technology for televisions, moving away from the zero-sum LCD panel manufacturing business.

On another front, Sentons USA has started licensing its Surface Wave ultrasonic multi-touch technology to smartphone manufacturers. The company’s IP makes it possible to control various smartphone functions by touching just about any part of the phone, front or rear.

Ultrasonic detection makes it possible to have squeeze sensors on the sides of phones, deploy tap and button sensors to replace mechanical buttons, incorporate hand position sensors to recognize how a user is holding a phone, and provide swipe sensors and sliders on the phone enclosure surface.

In a Fast Company article from October 17, the author describes using a Surface Wave-equipped phone: “…With this technology, which uses ultrasonic waves to detect the position of your fingers, you could scroll through menus simply by rubbing your finger where it usually sits naturally: along the side of the phone. When taking a picture, you could press lightly on the side of the phone to focus your picture, and then press harder to snap the image, with no button needed—just like a point-and-shoot camera. Then, you could slide your finger along the back of the phone to flip through your photos.”

Engineers at Sentons originally tried implementing the concept with conventional RF waves, but discovered that ultrasonic waves worked more reliably and used much less power. According to Sentons CEO Jess Lee, company engineers realized that by lining a phone’s body with small components oscillating at inaudibly high ultrasonic frequencies, their algorithm could detect any disturbance to the sound waves, such as a finger. And unlike RF signals, ultrasonic waves can work through metal.
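The detection idea as described can be sketched in a few lines (a minimal illustration under stated assumptions — the zone model, threshold, and function names are hypothetical, not Sentons’ actual algorithm): each sensing zone’s received amplitude is compared against a calibrated no-touch baseline, and a zone whose signal is damped well below baseline is reported as touched, since a finger absorbs ultrasonic energy.

```python
def calibrate(baseline_readings):
    """Average several no-touch readings per zone to get a baseline."""
    n = len(baseline_readings)
    zones = len(baseline_readings[0])
    return [sum(r[z] for r in baseline_readings) / n for z in range(zones)]

def detect_touches(baseline, reading, damping_threshold=0.25):
    """Report zones whose amplitude has dropped more than
    damping_threshold below baseline, i.e. a finger absorbing energy."""
    touched = []
    for zone, (base, amp) in enumerate(zip(baseline, reading)):
        if amp < base * (1.0 - damping_threshold):
            touched.append(zone)
    return touched

baseline = calibrate([[1.00, 1.00, 1.00], [1.02, 0.98, 1.00]])
touches = detect_touches(baseline, [1.00, 0.60, 0.99])  # zone 1 is damped
```

A real implementation would also track the baseline over time (temperature and grip change the acoustics) and distinguish taps, squeezes, and slides from the time profile of the damping, but the core signal is the same amplitude drop.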

The company is also investigating applications in wearables and cars. One application would be to build sensors directly into steering wheels so you won’t need to take your eyes off the road to look at the dashboard or a control console. Presumably, you’ll still need to look at something to view the results of a tap or push, such as directions, your gas gauge or battery remaining, or distance to your destination. (although it will probably be in a HUD – Editor)

Whether these advances succeed or fail in stimulating more smartphone sales remains to be seen. It seems to me that the whole concept of a smartphone has advanced considerably since the first iPhone came to market well over a decade ago, and that in the next decade we might need to come up with a better name for these compact personal mobility / communication / photography / video / organization / lifestyle gadgets… (PP)