Face Value: How New Signage Technologies Deliver Insights
Digital signage has proven to be an exceptionally effective way to deliver information to a target audience. Now it can also be used to gather information about that audience.
A new generation of signage solutions featuring integrated cameras, embedded sensors, speech recognition software and AI-powered analytics platforms can help organizations gather valuable insights about their customers. Signs with these advanced technologies can capture a range of characteristics such as gender, age, body style, clothing style — even mood.
As organizations collect and analyze such data over time, they can learn more about who their customers are, what they like and what they don’t like. Those data points can then trigger content personalized for the individual on the other side of the screen. In a retail setting, for example, the data might prompt messaging that promotes products likely to appeal to the customer’s general demographic profile.
In addition, an organization could analyze large sets of customer demographic data to fine-tune marketing campaigns, understand customer traffic patterns, improve store layouts, adjust staffing levels and evaluate product placement.
The data can even deliver an assessment of the signage strategy itself by identifying how many people are interacting with your displays and for how long. Facial recognition technology can determine audience dwell time — how long people look directly at a sign. According to a recent study, the average dwell time for digital signage is 4.6 seconds. Anything over 10 seconds would suggest your signage strategy is quite effective.
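The dwell-time calculation described above can be sketched in a few lines. This is a minimal illustration, not SageVIEW's actual implementation: it assumes a hypothetical face-tracking pipeline that emits one (start, end) timestamp pair per viewer session.

```python
from statistics import mean

def average_dwell_time(gaze_events):
    """Average time, in seconds, that viewers looked directly at a sign.

    gaze_events: list of (start, end) timestamps in seconds, one tuple
    per viewer session, as produced by a hypothetical face-tracking feed.
    """
    durations = [end - start for start, end in gaze_events]
    return mean(durations) if durations else 0.0

# Hypothetical tracking data: four viewer sessions.
events = [(0.0, 3.5), (10.0, 15.2), (20.0, 24.8), (30.0, 34.9)]
print(f"Average dwell time: {average_dwell_time(events):.1f}s")
```

Aggregating this metric per display, per daypart, or per content rotation is what lets you compare the result against benchmarks like the 4.6-second average cited above.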
Here’s a brief look at some of the ways digital signage can be used to capture important audience data:
AI-powered proximity sensors can detect when someone is nearby, and integrated cameras capture images of the person’s gestures such as swiping, scrolling, rotating, pinching or tapping. Software then matches the images against a prepopulated gesture library. Once the system interprets the gesture, it executes a command to perform the action associated with that gesture. By tracking such interactions, organizations can gain insight into which products, services or messages generate significant audience interest.
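The matching step described here can be illustrated with a toy nearest-neighbor lookup. The gesture library, feature vectors, and command names below are all hypothetical; real systems match much richer image features, but the control flow (observe, match against a prepopulated library, dispatch the associated command) is the same.

```python
import math

# Hypothetical gesture library: each gesture maps to a coarse 2-D motion
# feature vector (dx, dy) and the command the sign executes when matched.
GESTURE_LIBRARY = {
    "swipe_left":  ((-1.0, 0.0), "previous_slide"),
    "swipe_right": ((1.0, 0.0),  "next_slide"),
    "scroll_up":   ((0.0, 1.0),  "scroll_content_up"),
    "tap":         ((0.0, 0.0),  "select_item"),
}

def classify_gesture(observed):
    """Match an observed motion vector to the closest library gesture."""
    name, (_, command) = min(
        GESTURE_LIBRARY.items(),
        key=lambda kv: math.dist(observed, kv[1][0]),
    )
    return name, command

name, command = classify_gesture((0.9, 0.1))
print(name, command)  # closest library entry: swipe_right -> next_slide
```

Logging each matched gesture alongside the content on screen at the time is what turns these interactions into the engagement insight the paragraph describes.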
Interactive displays with embedded cameras and proximity sensors can determine when someone is actively engaged with the signage messaging. The display can then scan the individual for characteristics such as gender, age range, clothing style and apparent mood. After the system captures these identifiers, a machine-learning algorithm analyzes the data, matches it to a database of customer profiles and preferences, and triggers messaging with tailored purchasing recommendations.
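The final step, turning captured attributes into a content trigger, often reduces to a segment-to-promotion lookup. The sketch below is a simplified assumption, not a description of any particular product: the profile keys, segments, and promotion names are all invented for illustration.

```python
# Hypothetical rules table mapping detected viewer segments to promotions.
SEGMENT_PROMOTIONS = {
    ("18-24", "casual"):   "streetwear_sale",
    ("25-34", "business"): "office_essentials",
    ("35-54", "casual"):   "outdoor_gear",
}
DEFAULT_PROMOTION = "general_welcome"

def pick_promotion(profile):
    """Choose content for the detected viewer segment.

    profile: dict with keys like 'age_range' and 'clothing_style', as
    produced by a hypothetical attribute-detection model. Unrecognized
    or incomplete profiles fall back to generic messaging.
    """
    key = (profile.get("age_range"), profile.get("clothing_style"))
    return SEGMENT_PROMOTIONS.get(key, DEFAULT_PROMOTION)

viewer = {"age_range": "25-34", "clothing_style": "business", "mood": "neutral"}
print(pick_promotion(viewer))  # office_essentials
```

The fallback to a default promotion matters in practice: attribute detection is probabilistic, and a sign should always have something sensible to show when the model is unsure.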
Emotion recognition software works with facial recognition systems to identify a range of expressions such as frowns, smiles, raised eyebrows, furrowed brows and more that would indicate the mood or emotional state of an individual. Machine-learning algorithms then compare key elements of these images against a database of human facial expressions to suggest whether a customer is happy, angry, surprised, confused or bored. These capabilities are particularly useful in healthcare, security and market research applications.
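The comparison against a database of facial expressions can be sketched as a nearest-match search over expression features. The two features and the reference values below are hypothetical stand-ins; production systems compare dozens of facial landmarks, but the matching logic is analogous.

```python
# Hypothetical expression database keyed by two coarse facial features:
# mouth curvature (smile > 0, frown < 0) and brow raise (raised > 0).
EXPRESSION_DB = {
    "happy":     (0.8, 0.1),
    "angry":     (-0.6, -0.7),
    "surprised": (0.1, 0.9),
    "bored":     (-0.2, -0.1),
}

def classify_emotion(mouth_curve, brow_raise):
    """Return the database emotion closest to the measured features."""
    return min(
        EXPRESSION_DB,
        key=lambda e: (EXPRESSION_DB[e][0] - mouth_curve) ** 2
                      + (EXPRESSION_DB[e][1] - brow_raise) ** 2,
    )

print(classify_emotion(0.7, 0.0))    # happy
print(classify_emotion(-0.5, -0.6))  # angry
```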
Over the past several years, digital signage solutions have become important vehicles for customer interactions. With increasingly sophisticated features, they now can capture and analyze a good deal of data generated by those interactions.
SageNet is uniquely qualified to help your organization take advantage of these emerging capabilities. Through our managed SageVIEW services, we can design, deploy, manage and maintain a digital signage system that can help you make better connections with your customers.
Interested in what our experts had to say?
Learn more about our services, all driven by the changing technology landscape.