Binghamton University, State University of New York

The Seymour Kunis Media Core

Gary Kunis '73 named the Seymour Kunis Media Core in honor of his father with a gift to the Watson School Equipment Endowment.

The Seymour Kunis Media Core provides facilities for research in multimedia, including multimedia security, multimedia forensics, biometrics, steganography and steganalysis, immersive displays, and virtual and augmented reality. 

Explore the lab:

Laboratory Features at the Seymour Kunis Media Lab

Research Projects


AINT Steganographic Operating System

AINT, a steganographic operating system, is a virtual OS that is encrypted and scattered within ordinary files like PDF documents. When a user types a common terminal command, AINT unpacks, decrypts and runs itself, creating a local web server that serves a Web 2.0 virtual desktop to a web browser in private browsing mode. This allows the user to view and edit hidden files, and run hidden applications. The AINT architecture has the advantage that there is no suspicious software footprint that would suggest the existence of hidden files.
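The published description does not detail AINT's formats, but the core trick of riding an encrypted payload inside an ordinary carrier file can be sketched in a few lines. The sketch below appends an encrypted payload after a PDF's final %%EOF marker (PDF readers ignore trailing bytes). The passphrase-derived XOR keystream and the visible %hidden marker are illustrative stand-ins only: a real system such as AINT would use a proper cipher and scatter the data so that, as noted above, no detectable footprint remains.

```python
import hashlib

def keystream(passphrase: bytes, length: int) -> bytes:
    # Derive a pseudo-random byte stream from the passphrase by
    # chained SHA-256 hashing (a stand-in for a real stream cipher).
    out = bytearray()
    block = hashlib.sha256(passphrase).digest()
    while len(out) < length:
        out.extend(block)
        block = hashlib.sha256(block).digest()
    return bytes(out[:length])

def embed(carrier: bytes, payload: bytes, passphrase: bytes) -> bytes:
    # PDF readers ignore bytes after the final %%EOF marker, so the
    # encrypted payload can ride along without breaking the document.
    ks = keystream(passphrase, len(payload))
    hidden = bytes(a ^ b for a, b in zip(payload, ks))
    return carrier + b"\n%hidden\n" + hidden

def extract(stego: bytes, passphrase: bytes) -> bytes:
    # Locate the payload and reverse the XOR with the same keystream.
    _, _, hidden = stego.rpartition(b"\n%hidden\n")
    ks = keystream(passphrase, len(hidden))
    return bytes(a ^ b for a, b in zip(hidden, ks))
```

The wrong passphrase yields a different keystream, so extraction without the key returns only noise.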

Bacterial biofilm formation and 3D structure

An ongoing Howard Hughes Medical Institute (HHMI) project with the Biology department, featuring one undergraduate and one PhD student from each department, is using the media core to analyze microscope images of bacterial biofilms. The purpose of this project is to use image processing and signal processing techniques to better understand the process of biofilm formation, and to better understand the 3D structure of biofilms.
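The project's exact pipeline is not described here, but a typical first step in this kind of analysis, separating biofilm from background in a grayscale micrograph, can be sketched with automatic thresholding. The use of Otsu's method below is an assumption for illustration, not the project's documented approach.

```python
def otsu_threshold(pixels):
    # Otsu's method: pick the threshold that maximizes the
    # between-class variance of foreground vs. background.
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * hist[i] for i in range(256))
    sum_bg, w_bg, best_t, best_var = 0.0, 0, 0, -1.0
    for t in range(256):
        w_bg += hist[t]
        if w_bg == 0:
            continue
        w_fg = total - w_bg
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (sum_all - sum_bg) / w_fg
        var = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def coverage(pixels, threshold):
    # Fraction of pixels brighter than the threshold,
    # i.e., the apparent biofilm area fraction.
    return sum(p > threshold for p in pixels) / len(pixels)
```

Tracking the coverage fraction across a time series of micrographs gives a simple quantitative view of biofilm growth.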

Eye and face tracking with an active camera

New eye gaze tracking technology has been developed that does not require infrared cameras. A unique feature of this approach is that the user's 3D eyeball is estimated, with the fovea and iris positions used to construct a viewing vector. Such eye gaze tracking can be used for many applications, including attention estimation, fatigue detection, eye-reading and viewing analysis, and eye-computer interaction. This work has been supported by the Air Force Research Lab and the National Science Foundation. Three papers have been published from this work, and a patent has been filed through the Technology Transfer Office.
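As a rough illustration of the viewing-vector idea, the sketch below forms a gaze direction from assumed 3D fovea and iris-center estimates and derives an attention angle relative to the camera axis. The coordinate convention (camera at the origin, optical axis along -z toward the camera) and the use of an angle threshold for attention are assumptions of this sketch, not the lab's actual formulation.

```python
import math

def gaze_vector(fovea, iris):
    # Viewing vector: from the estimated fovea, through the
    # iris center, normalized to unit length.
    v = [i - f for f, i in zip(fovea, iris)]
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def attention_angle(gaze):
    # Angle in degrees between the gaze and the camera's optical
    # axis (0, 0, -1); small values suggest the user faces the camera.
    dot = max(-1.0, min(1.0, -gaze[2]))
    return math.degrees(math.acos(dot))
```

A fatigue or attention estimator could then, for example, flag sustained periods where the attention angle exceeds some threshold.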

Head gesture tracking and interacting for face recognition

An active camera locates a person and automatically focuses (zooms) on the person's face for feature tracking. The 3D head pose is also estimated and tracked in real time. Head gestures are used to control and interact with machines for operations such as real-time head painting. Two papers have been published from this work.

Eye and hand pointing tracking in a smart room setup

A finger-pointing system will be demonstrated. The 3D hand and fingers are tracked using two regular cameras, and the finger-pointing direction controls button clicking as a natural gesture interaction, in an attempt to replace the mouse and keyboard. Applications include the next generation of human-computer interfaces and, ultimately, actions such as using finger pointing as a TV remote control. The hand gesture tracking has also been extended to hand gesture recognition for sign understanding. This work was supported by the Air Force Research Lab, and two publications have been generated. A patent has also been filed for this work through the BU Technology Transfer Office.
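The pointing-to-click mapping described above can be sketched as a ray-plane intersection followed by a hit test. The coordinate convention (screen in the z = 0 plane) and the button layout are assumptions for illustration only.

```python
def point_on_screen(finger_tip, finger_dir, screen_z=0.0):
    # Extend the pointing ray from the fingertip until it
    # crosses the screen plane at z = screen_z.
    t = (screen_z - finger_tip[2]) / finger_dir[2]
    return (finger_tip[0] + t * finger_dir[0],
            finger_tip[1] + t * finger_dir[1])

def hit_button(point, buttons):
    # Return the name of the first button rectangle containing
    # the point, emulating a mouse click from a pointing gesture.
    x, y = point
    for name, (x0, y0, x1, y1) in buttons.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None
```

A "click" event could then be triggered when the intersection point dwells inside one button rectangle for a fixed time.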

Face and facial expression recognition with an adaptive avatar behavior generation

This demo will showcase the newly developed facial expression recognition system. Human facial expressions (e.g., happiness, anger, sadness, surprise) are tracked and estimated in real time. The estimated expressions are used to control an avatar for expression mimicking and expression response. On the near horizon is enabling computers to recognize the user's emotional state. Automating the analysis of rapid facial behavior could help us understand human emotion, inform the design of a new generation of human-computer interfaces, teach parents about their babies, reveal patients' moods to doctors, and even root out liars.

There are many potential applications, including healthcare (diagnosis, pain management, communication, and assistive living), security, military, education (e.g., monitoring and analyzing students' performance in class and using advanced technology to improve learning and teaching), training, entertainment and gaming, and retail and advertising.

This work has been supported by NSF and NYSTAR. A technology disclosure has been filed with the BU Technology Transfer Office. Ten publications have been generated in top conferences and journals.

3-D Mask Illusion

This is the work of a senior design team in the 2011-12 academic year. Video footage of a face is projected onto a concave face mask, creating the illusion of a face that follows the viewer.

The student team built a vacuum-forming machine, and wrote software to distort face images so that selected facial features are precisely projected upon corresponding features of a face mask. They also conducted research to analyze the cause of the face mask illusion, and its effectiveness at different ranges for both animated faces and still images.
Applications include interactive kiosks, and possibly general-purpose back projection displays.
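The feature-alignment step the team implemented can be illustrated with the simplest possible warp: an affine transform fitted to three corresponding feature points (e.g., the two eyes and the mouth), mapping locations in the face video onto the matching features of the mask. A three-point affine is a simplification for this sketch; the students' software may well have used a more general distortion.

```python
def _solve3(m, v):
    # Cramer's rule for a 3x3 linear system m @ x = v.
    def det(a):
        return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
              - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
              + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))
    d = det(m)
    sol = []
    for i in range(3):
        mi = [row[:] for row in m]
        for r in range(3):
            mi[r][i] = v[r]
        sol.append(det(mi) / d)
    return sol

def affine_from_points(src, dst):
    # Fit the affine map (x, y) -> (a*x + b*y + c, d*x + e*y + f)
    # taking three source feature points in the video frame onto
    # the corresponding feature points on the projected mask.
    m = [[x, y, 1.0] for x, y in src]
    a, b, c = _solve3(m, [x for x, _ in dst])
    d, e, f = _solve3(m, [y for _, y in dst])
    return lambda x, y: (a * x + b * y + c, d * x + e * y + f)
```

Applying the fitted map to every pixel of the video frame pre-distorts the image so that, after projection, eyes land on the mask's eyes and the mouth on its mouth.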


Last Updated: 4/22/14