BAUM-2: a multilingual audio-visual affective face database

dc.contributor.author Eroglu Erdem, Cigdem
dc.contributor.author Turan, Cigdem
dc.contributor.author Aydin, Zafer
dc.contributor.authorID 0000-0001-7686-6298 en_US
dc.contributor.department AGÜ, Faculty of Engineering, Department of Computer Engineering en_US
dc.contributor.institutionauthor Aydin, Zafer
dc.date.accessioned 2024-06-12T08:20:00Z
dc.date.available 2024-06-12T08:20:00Z
dc.date.issued 2015 en_US
dc.description.abstract Access to audio-visual databases that contain enough variety and are richly annotated is essential for assessing the performance of algorithms in affective computing applications, which require emotion recognition from face and/or speech data. Most databases available today have been recorded under tightly controlled environments, are mostly acted and do not contain speech data. We first present a semi-automatic method that can extract audio-visual facial video clips from movies and TV programs in any language. The method is based on automatic detection and tracking of faces in a movie until the face is occluded or a scene cut occurs. We also created a video-based database, named BAUM-2, which consists of annotated audio-visual facial clips in several languages. The collected clips simulate real-world conditions by containing various head poses, illumination conditions, accessories, temporary occlusions and subjects with a wide range of ages. The proposed semi-automatic affective clip extraction method can easily be used to extend the database with clips in other languages. We also created an image-based facial expression database from the peak frames of the video clips, named BAUM-2i. Baseline image- and video-based facial expression recognition results using state-of-the-art features and classifiers indicate that facial expression recognition under tough and close-to-natural conditions is quite challenging. en_US
dc.identifier.endpage 7459 en_US
dc.identifier.issn 1380-7501
dc.identifier.startpage 7429 en_US
dc.identifier.uri https://doi.org/10.1007/s11042-014-1986-2
dc.identifier.uri https://link.springer.com/article/10.1007/s11042-014-1986-2
dc.identifier.uri https://hdl.handle.net/20.500.12573/2202
dc.identifier.volume 74 en_US
dc.language.iso eng en_US
dc.publisher Kluwer Academic Publishers (SpringerLink) en_US
dc.relation.isversionof 10.1007/s11042-014-1986-2 en_US
dc.relation.journal Multimedia Tools and Applications en_US
dc.relation.publicationcategory Article - International Refereed Journal - Institutional Faculty Member en_US
dc.rights info:eu-repo/semantics/closedAccess en_US
dc.subject Affective database en_US
dc.subject Audio-visual affective database en_US
dc.subject Facial expression recognition en_US
dc.title BAUM-2: a multilingual audio-visual affective face database en_US
dc.type article en_US
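
Note: the abstract above describes a semi-automatic clip extraction method that detects and tracks a face until it is occluded or a scene cut occurs. The following is a minimal illustrative sketch of that idea only, assuming OpenCV's bundled Haar cascade face detector and a simple histogram-correlation scene-cut test; the function name, thresholds and the cut test are placeholder assumptions, not the authors' actual pipeline.

import cv2

def extract_face_clips(video_path, cut_threshold=0.6):
    """Yield (start_frame, end_frame) spans in which a face is detected
    continuously and no scene cut (large histogram change) occurs."""
    cap = cv2.VideoCapture(video_path)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    prev_hist, start, frame_idx = None, None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        hist = cv2.calcHist([gray], [0], None, [64], [0, 256])
        cv2.normalize(hist, hist)
        cut = (prev_hist is not None and
               cv2.compareHist(prev_hist, hist, cv2.HISTCMP_CORREL) < cut_threshold)
        if len(faces) > 0 and not cut:
            if start is None:
                start = frame_idx          # a face appeared: open a candidate clip
        else:
            if start is not None:
                yield (start, frame_idx)   # face lost or scene cut: close the clip
                start = None
        prev_hist, frame_idx = hist, frame_idx + 1
    if start is not None:
        yield (start, frame_idx)
    cap.release()

As the abstract notes, the method is only semi-automatic: in practice the resulting candidate clips would still need manual screening and affective annotation.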

Files

Original bundle

Name: s11042-014-1986-2.pdf
Size: 5.67 MB
Format: Adobe Portable Document Format
Description: Article file

License bundle

Name: license.txt
Size: 1.44 KB
Description: Item-specific license agreed to upon submission