
by Matt Magolis

Yesterday, I highlighted Apple’s multi-touch sensor patterns and stackup patent, issued on April 24th, 2014. The newly issued claims indicate that Apple has put significant time and effort into improving its multi-touch sensor patterns, with the aim of enhancing the overall experience of handling a device with a curved screen. The number of individual claims has decreased from 39 to 14; however, the precision of the 2014 claims versus the 2009 claims is astounding. This newly issued patent may indicate Apple is very close to introducing a curved iPhone screen in the not-too-distant future.

Today, I wanted to share a few Apple job postings I found that indicate Apple’s focus on the “Human Interface” of its next-generation devices, which include the iPhone, iPad, and Mac (the iWatch isn’t listed, but we know it exists). Apple is looking to create a completely new user experience that will further connect people to their portable electronic devices.

The first job posting goes back to January 5th, 2014, when Apple was recruiting a Multitouch Sensor Algorithms Engineer. The job focuses on gesture recognition, software that typically allows humans to communicate with machines without any mechanical device. For example, you could simply point your finger at the screen and the cursor would move. The Multitouch Sensor Algorithms Engineer job description is below:

Be part of the engineering team creating next-generation input devices and displays. We are looking for a Gesture or Pattern Recognition Algorithms Engineer with working knowledge of gesture recognition, statistical signal processing and pattern recognition along with strong programming skills.
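To make the gesture-recognition idea concrete, here is a minimal sketch of the kind of problem such an engineer works on: classifying a swipe from a sequence of (x, y) touch samples. All names and thresholds here are hypothetical illustrations, not Apple APIs.

```python
def classify_swipe(points, min_distance=50.0):
    """Return 'left', 'right', 'up', 'down', or 'tap' for a touch path.

    points: list of (x, y) samples from a touch sensor, in screen order.
    min_distance: hypothetical threshold below which motion counts as a tap.
    """
    if len(points) < 2:
        return "tap"
    # Net displacement from first to last sample.
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if max(abs(dx), abs(dy)) < min_distance:
        return "tap"
    # Dominant axis decides the gesture; sign decides the direction.
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

Real multi-touch pipelines of course involve statistical models over noisy multi-finger data, but the core task is the same: map raw sensor samples to an intent the software can act on.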

Apple posted a position for a HID Multitouch Sensor Algorithm Manager on March 7th, 2014. The position is focused on the “next-generation Human Interface Devices for iPhone, iPad, and Mac,” creating an “insanely great” sensing experience built on “groundbreaking sensing technologies.” Take my word for it: Apple is developing mankind’s “6th sense,” and it appears Apple is going to take the user experience to another dimension across its next generation of devices.

Come lead a team of creative, energetic, and enthusiastic algorithm engineers responsible for next-generation Human Interface Devices for iPhone, iPad, and Mac. We are looking for talented individuals who enjoy creating pattern recognition and statistical signal processing algorithms, prototyping concepts, investigating new technologies, and ultimately delivering high-quality products. In this role, you will be at the focal point of many cross-functional interactions with the hardware, software, QA, industrial and UI design teams.

Lead the team responsible for developing “insanely great” sensing experiences for Apple products.
Spearhead the development and integration of new technologies from proof of concept phase through production ramp.
Invent algorithm architectures for groundbreaking sensing technologies.
Focus the team on parsimonious methods to achieve fluid customer experience.
Identify, create, and document new sensing system metrics and specifications that deliver performance that matters to customer perception.
Simulate sensing system performance and error sensitivities.
Diagnose engineering issues with true scientific method, driving to definitive root cause, in close collaboration with cross-functional team members.
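As a hedged illustration of the “statistical signal processing” the posting mentions, one basic building block is smoothing noisy sensor readings so a fingertip doesn’t jitter on screen. The sketch below uses a simple exponential moving average; the function and parameter names are illustrative, not from any Apple job description.

```python
def smooth(samples, alpha=0.3):
    """Exponentially smooth a stream of 1-D sensor samples.

    alpha: hypothetical smoothing factor in (0, 1]; higher values
    track the raw signal more closely, lower values filter more noise.
    """
    if not samples:
        return []
    out = [samples[0]]
    for x in samples[1:]:
        # Blend the new reading with the running estimate.
        out.append(alpha * x + (1 - alpha) * out[-1])
    return out
```

A filter like this trades responsiveness for stability, which is exactly the kind of “error sensitivity” a sensing team would simulate before shipping.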