Technology Areas

The i3B network uses a variety of sensing techniques to measure the brain, the body, behavior and the (home) environment. The behaviors studied range widely, including eating, buying, driving, walking, social interaction and stress.

The design of high-throughput and high-content measurement and analysis systems requires a multidisciplinary approach involving a range of technology areas. The participants in the i3B network bring together the following technologies as enablers for innovative product development:

Sensor technology
Design of small, low-cost, wireless sensors and body area networks for physiological measurements.

Eye tracking
Tracking of eye movements in stationary and ambulatory situations.

Audio signal processing
Analysis of human speech and vocalizations of animals for the purpose of classifying content and detecting emotions.
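A first step in classifying speech or animal vocalizations is segmenting the recording into vocal and silent spans. The sketch below is a toy energy-based segmenter, not any specific i3B tool: it marks frames whose RMS energy exceeds a threshold; real systems would use spectral features and a trained classifier. The function name and defaults are illustrative.

```python
import math

def detect_vocalizations(samples, rate, frame_ms=20, threshold=0.1):
    """Return (start_s, end_s) spans where frame RMS energy >= threshold.

    samples: mono audio as a sequence of floats, rate: samples per second.
    """
    frame = max(1, int(rate * frame_ms / 1000))
    spans, start = [], None
    for i in range(0, len(samples), frame):
        chunk = samples[i:i + frame]
        rms = math.sqrt(sum(x * x for x in chunk) / len(chunk))
        t = i / rate  # frame start time in seconds
        if rms >= threshold and start is None:
            start = t                      # vocalization onset
        elif rms < threshold and start is not None:
            spans.append((start, t))       # vocalization offset
            start = None
    if start is not None:                  # signal ended while active
        spans.append((start, len(samples) / rate))
    return spans
```

On one second of silence followed by one second of signal, this returns a single span covering the second half of the recording.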

Physiological sensing
Pattern recognition in EEG, ECG, EMG, GSR, and other physiological signals.
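As a minimal illustration of pattern recognition in a physiological signal, the sketch below picks R-peaks from an ECG trace by simple thresholded local-maximum detection and derives a mean heart rate. This is a toy detector under assumed units (normalized amplitude, samples per second); production systems use far more robust algorithms such as Pan-Tompkins.

```python
def r_peaks(ecg, rate, threshold=0.6, refractory_s=0.25):
    """Indices of R-peaks: local maxima above threshold, separated by
    at least a refractory period (no two beats closer than ~250 ms)."""
    min_gap = int(refractory_s * rate)
    peaks = []
    for i in range(1, len(ecg) - 1):
        if ecg[i] >= threshold and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]:
            if not peaks or i - peaks[-1] >= min_gap:
                peaks.append(i)
    return peaks

def heart_rate_bpm(peaks, rate):
    """Mean heart rate from successive R-R intervals, or None if <2 peaks."""
    if len(peaks) < 2:
        return None
    intervals = [(b - a) / rate for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(intervals) / len(intervals))
```

With beats one second apart at a 250 Hz sampling rate, this yields 60 bpm.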

Video technology
Analogue and digital video recording, storage and retrieval, compression and decompression.

Computer vision
Digital image processing with monocular cameras, stereo cameras and multi-camera setups.

Video tracking
Following the movements of one or more persons or animals against static or variable backgrounds.
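The core idea behind tracking against a static background can be sketched in a few lines: subtract a reference frame, treat pixels that differ strongly as foreground, and report their centroid. Frames here are plain 2-D grayscale lists and the threshold is illustrative; real trackers handle variable backgrounds, multiple subjects and occlusion.

```python
def track_centroid(background, frame, threshold=30):
    """Locate a moving subject by background subtraction.

    Pixels whose absolute difference from the static background exceeds
    the threshold count as foreground; returns their centroid as
    (row, col), or None when no foreground pixel is found.
    """
    rows, cols, count = 0, 0, 0
    for r, (bg_row, fr_row) in enumerate(zip(background, frame)):
        for c, (b, f) in enumerate(zip(bg_row, fr_row)):
            if abs(f - b) > threshold:
                rows += r
                cols += c
                count += 1
    if count == 0:
        return None
    return (rows / count, cols / count)
```

Running this per frame yields a trajectory of centroid positions over time.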

Head pose & body posture recognition
Recognition of specific head poses and body postures.

Gesture tracking
Recognition of gestures (head, arms, hands).

Facial expression analysis
Non-invasive measurement of facial expressions.

User-system interaction
Capture and interpretation of real-time keyboard and mouse events, detection of usage patterns and mental states.

Sensor fusion
Integration of the signals from several different types of sensors.
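A classic small-scale example of sensor fusion is the complementary filter, which blends a gyroscope's smooth but drifting rate integral with an accelerometer's noisy but drift-free tilt estimate. The sketch below is illustrative; the blending weight and signal names are assumptions, not part of any i3B system.

```python
def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98):
    """Fuse gyroscope rate (deg/s) with accelerometer tilt (deg).

    The filter trusts the integrated gyro on short timescales (weight
    alpha) and the accelerometer on long timescales (weight 1 - alpha),
    so gyro drift stays bounded. Returns the fused angle per sample.
    """
    angle = accel_angles[0]
    out = []
    for rate, acc in zip(gyro_rates, accel_angles):
        angle = alpha * (angle + rate * dt) + (1 - alpha) * acc
        out.append(angle)
    return out
```

With a biased gyro (constant 1 deg/s error) and a steady accelerometer reading, the fused estimate settles near the true angle instead of drifting without bound.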

Multimodal data integration
Integration of data from different sources (sensors, video, system events, databases, etc.).

Complex event recognition
Recognition of high-level (complex) events based on combinations of low-level (simple) events.
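The mechanism can be illustrated with a minimal pattern matcher over a time-ordered stream of low-level events: a complex event fires when a given label sequence completes within a time window. The labels and the example pattern below are purely illustrative.

```python
def find_complex_events(events, pattern, max_span_s):
    """Find occurrences of a label sequence in a time-ordered stream.

    events: list of (timestamp_s, label); pattern: tuple of labels that
    must occur in order within max_span_s seconds of the first label.
    Returns a (start_time, end_time) pair per completed match.
    """
    matches = []
    for i, (t0, label) in enumerate(events):
        if label != pattern[0]:
            continue
        k, t_last = 1, t0          # next pattern index, last matched time
        for t, lab in events[i + 1:]:
            if t - t0 > max_span_s:
                break              # window expired before completion
            if lab == pattern[k]:
                k, t_last = k + 1, t
                if k == len(pattern):
                    matches.append((t0, t_last))
                    break
    return matches
```

For instance, an "eating bout" could be defined as approach, grasp and chew occurring within three seconds; an approach not followed by the rest of the sequence in time yields no match.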

Virtual & augmented reality
Design of virtual reality and augmented reality environments for research and training purposes.

Real-time feedback
Design of systems that process very large numbers of low-level events in real time, detect high-level events, and give feedback to a person or animal on that basis.
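The shape of such a feedback loop can be sketched as a sliding-window statistic over an incoming sample stream that triggers a callback when a condition becomes true. The window size, threshold and callback below are assumptions for illustration; in practice the samples arrive from a sensor driver and the callback drives an actuator.

```python
from collections import deque

def feedback_stream(samples, window, threshold, notify):
    """Call notify(t, mean) each time the sliding-window mean of the
    stream crosses the threshold upward (edge-triggered, not repeated
    while the mean stays high).

    samples: iterable of (timestamp, value); window: samples per window.
    """
    buf = deque(maxlen=window)     # keeps only the last `window` values
    above = False                  # are we currently above threshold?
    for t, value in samples:
        buf.append(value)
        mean = sum(buf) / len(buf)
        if mean >= threshold and not above:
            notify(t, mean)        # upward crossing: give feedback once
            above = True
        elif mean < threshold:
            above = False          # re-arm for the next crossing
```

Edge-triggering matters here: feedback fires once per episode rather than on every sample while the signal stays elevated.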

Actuators
Actuation based on light, sound, scent, haptics, and digital displays.

Brain-computer interface
Pattern recognition in brain signals for the purpose of controlling actuators.

Robotics
Design of robots capable of recognizing human behavior and responding adequately to it.