The Facial Action Coding System (FACS) is a comprehensive, anatomically based system for describing all visually discernible facial movement. It breaks down facial expressions into individual components of muscle movement, called Action Units (AUs).
FACS is used across many personal and professional settings. It is widely used in scientific research, and it is also used by animators and by computer scientists working on facial recognition.
The FACS manual describes the criteria for observing and coding each Action Unit. It also describes how AUs appear in combinations. The FACS manual was first published in 1978 by Ekman and Friesen, and was most recently revised in 2002. The Paul Ekman Group offers the manual for sale.
The FACS Final Test is the only available standard of proficiency in FACS coding. Anyone who wants to claim that they know FACS and can code in FACS must pass the FACS Final Test. After completing the self-study or a workshop, you can take the Final Test for certification. The Paul Ekman Group offers the FACS test for sale.
FACS self-instruction usually takes about 50 to 100 hours to complete. Studying FACS five days a week for two hours a day puts most learners closer to 50 hours than 100. Dr. Ekman recommends training in groups, which can make the high volume of information easier to learn.
We endorse Erika Rosenberg, who teaches a five-day FACS workshop that takes students through the entire manual and prepares them for certification. Currently there are no online or in-person versions of FACS training that have been evaluated or approved by Paul Ekman.
3. Where can I find the original Ekman & Friesen 1978 article?
The original version of FACS was published in 1978. It was a manual, not an article, sold for training just as the current manual is. The original version is out of print, and coding techniques have been modified since then. The 2002 manual is the current version and the only one that should be used for scoring today.
The Facial Action Coding System (FACS) (Ekman & Friesen, 1978) is a comprehensive and widely used method of objectively describing facial activity. Little is known, however, about inter-observer reliability in coding the occurrence, intensity, and timing of individual FACS action units. The present study evaluated the reliability of these measures. Observational data came from three independent laboratory studies designed to elicit a wide range of spontaneous expressions of emotion. Emotion challenges included olfactory stimulation, social stress, and cues related to nicotine craving. Facial behavior was video-recorded and independently scored by two FACS-certified coders. Overall, we found good to excellent reliability for the occurrence, intensity, and timing of individual action units and for corresponding measures of more global emotion-specified combinations.
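Inter-observer agreement on occurrence is often summarized with a chance-corrected statistic. As a minimal sketch (hypothetical data; not the study's actual analysis), Cohen's kappa for the frame-by-frame occurrence of a single AU scored by two coders:

```python
def cohens_kappa(coder1, coder2):
    """Cohen's kappa for two binary ratings (e.g. AU present/absent per frame)."""
    n = len(coder1)
    p_obs = sum(a == b for a, b in zip(coder1, coder2)) / n  # raw agreement
    p1 = sum(coder1) / n                      # coder 1's base rate of "present"
    p2 = sum(coder2) / n                      # coder 2's base rate of "present"
    p_exp = p1 * p2 + (1 - p1) * (1 - p2)     # agreement expected by chance
    return (p_obs - p_exp) / (1 - p_exp)

# Two coders scoring AU12 over eight video frames (made-up data):
k = cohens_kappa([1, 1, 0, 0, 1, 0, 0, 1], [1, 1, 0, 0, 1, 0, 1, 1])  # → 0.75
```

Values around 0.6-0.8 are conventionally read as "good" agreement, which is the kind of benchmark behind the "good to excellent reliability" phrasing above.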
The Facial Action Coding System (FACS) is a system to taxonomize human facial movements by their appearance on the face, based on a system originally developed by the Swedish anatomist Carl-Herman Hjortsjö.[1] It was later adopted by Paul Ekman and Wallace V. Friesen, and published in 1978.[2] Ekman, Friesen, and Joseph C. Hager published a significant update to FACS in 2002.[3] FACS encodes the movements of individual facial muscles from slight, momentary changes in facial appearance. It has proven useful to psychologists and to animators.
Using FACS,[5] human coders can manually code nearly any anatomically possible facial expression, deconstructing it into the specific "action units" (AUs) and the temporal segments that produced the expression. Because AUs are independent of any interpretation, they can be used for any higher-order decision-making process, including recognition of basic emotions or pre-programmed commands for an ambient intelligent environment. The FACS manual is over 500 pages in length and provides the AUs, as well as Ekman's interpretation of their meanings.
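As a sketch of such a higher-order decision process, AU sets from a coded frame can be matched against emotion prototypes. The prototypes below are the commonly cited ones (e.g. AU6+AU12 for a Duchenne smile), not the full FACS or EMFACS tables:

```python
# Prototype AU sets for a few basic emotions (illustrative subset only).
EMOTION_PROTOTYPES = {
    "happiness": frozenset({6, 12}),    # cheek raiser + lip corner puller
    "sadness":   frozenset({1, 4, 15}), # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  frozenset({1, 2, 5, 26}),
}

def recognize(observed_aus):
    """Return every emotion whose prototype AUs all appear in the coded frame."""
    observed = set(observed_aus)
    return [name for name, proto in EMOTION_PROTOTYPES.items()
            if proto <= observed]

recognize({6, 12, 25})  # AU25 (lips part) also present → ["happiness"]
```

Because the AUs themselves carry no interpretation, the same coded frame could instead drive any other rule set, such as commands for an ambient intelligent environment.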
FACS defines AUs as contractions or relaxations of one or more muscles. It also defines a number of "action descriptors", which differ from AUs in that the authors of FACS have not specified their muscular basis and have not distinguished specific behaviors as precisely as they have for the AUs.
Although the labeling of expressions currently requires trained experts, researchers have had some success in using computers to automatically identify the FACS codes.[9] One obstacle to automatic FACS code recognition is a shortage of manually coded ground truth data.[10]
The original FACS has been modified to analyze facial movements in several non-human primates, namely chimpanzees,[13] rhesus macaques,[14] gibbons and siamangs,[15] and orangutans.[16] More recently, it has also been developed for domestic species, including dogs,[17] horses,[18] and cats.[19] As with the human FACS, the animal FACS manuals are available online for each species, with the respective certification tests.[20]
Because of its anatomical basis, FACS can thus be used to compare facial repertoires across species. A study by Vick and colleagues (2006) suggests that FACS can be modified to take differences in underlying morphology into account. Such considerations enable a comparison of the homologous facial movements present in humans and chimpanzees, showing that the facial expressions of both species result from extremely notable appearance changes. The development of FACS tools for different species allows the objective and anatomical study of facial expressions in communicative and emotional contexts. Furthermore, a cross-species analysis of facial expressions can help to answer interesting questions, such as which emotions are uniquely human.[21]
The Emotional Facial Action Coding System (EMFACS)[22] and the Facial Action Coding System Affect Interpretation Dictionary (FACSAID)[23] consider only emotion-related facial actions. Examples of these are:
Happiness: 6+12
Sadness: 1+4+15
Surprise: 1+2+5B+26
Fear: 1+2+4+5+7+20+26
Anger: 4+5+7+23
Disgust: 9+15+17
Contempt: R12A+R14A
FACS coding is also used extensively in computer animation, with facial expressions being expressed as vector graphics of AUs.[24] FACS vectors are used as weights for blendshapes corresponding to each AU, with the resulting face mesh then being used to render the finished face.[25] Deep learning techniques can be used to determine the FACS vectors from face images obtained during motion capture acting or other performances.[26]
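The blendshape weighting described above can be sketched as a linear model: the rendered mesh is the neutral face plus each AU's vertex offsets scaled by its weight. The mesh and offsets below are tiny made-up stand-ins for a real rig:

```python
import numpy as np

# Toy face mesh: 4 vertices in 3D (a real rig has thousands of vertices).
neutral = np.zeros((4, 3))

# One blendshape (per-vertex offset from neutral) per Action Unit.
# The offsets here are hypothetical, not taken from any real rig.
blendshapes = {
    6:  np.array([[0.0, 0.2, 0.0]] * 4),  # stand-in for AU6 (cheek raiser)
    12: np.array([[0.1, 0.0, 0.0]] * 4),  # stand-in for AU12 (lip corner puller)
}

def apply_facs_vector(neutral, blendshapes, weights):
    """Linear blendshape model: mesh = neutral + sum_i w_i * delta_i."""
    mesh = neutral.copy()
    for au, w in weights.items():
        mesh += w * blendshapes[au]
    return mesh

# A smile: AU6 at half intensity, AU12 at full intensity.
mesh = apply_facs_vector(neutral, blendshapes, {6: 0.5, 12: 1.0})
```

A deep-learning front end would supply the `weights` dictionary (the FACS vector) per frame of captured performance; the blendshape sum itself stays this simple.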
For clarification, the FACS is an index of facial expressions, but does not actually provide any bio-mechanical information about the degree of muscle activation. Though muscle activation is not part of the FACS, the main muscles involved in the facial expression have been added here.
There are other modifiers present in FACS codes for emotional expressions, such as "R", which represents an action that occurs on the right side of the face, and "L" for actions that occur on the left. An action that is unilateral (occurs on only one side of the face) but has no specific side is indicated with a "U", and an action that is bilateral but has a stronger side is indicated with an "A" for asymmetric.
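A small parser illustrates how these modifiers combine with AU numbers and intensity grades. The code format assumed here (optional side prefix, AU number, optional A-E intensity suffix, as in "R12B") is a simplification for illustration, not the manual's full syntax:

```python
import re

# Optional laterality prefix (L/R/U/A), AU number, optional intensity grade A-E.
CODE_RE = re.compile(r"^([LRUA]?)(\d+)([A-E]?)$")

def parse_code(code):
    """Split a FACS code string into (side, AU number, intensity)."""
    m = CODE_RE.match(code)
    if not m:
        raise ValueError(f"not a FACS code: {code!r}")
    side, au, intensity = m.groups()
    return side or None, int(au), intensity or None

parse_code("R12B")  # → ("R", 12, "B"): right-side AU12 at intensity B
parse_code("4")     # → (None, 4, None): AU4, no side or intensity given
```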
(a) Instantaneous plane-averaged temperature profiles. Thick lines show the profiles at the center of the velocity-averaging window; thin lines show the profiles at the start and end of the averaging window. (b) Plane- and time-averaged horizontal velocity components, averaged over one inertial period.
Estimates of the friction velocity using several different methods at two locations in the mixed layer. Horizontal lines show the friction velocity observed in the simulations and 1σ of the time series. (left to right) Models are the balance method Eq. (24), the dissipation method Eq. (25), the profile method Eq. (23), the modified law-of-the-wall Eq. (26), and the modified profile method Eq. (30). Note that when the flow is unstratified, the modified law-of-the-wall and the modified profile method are identical to the profile method.
A stratified bottom Ekman layer over a nonsloping, rough surface is studied using a three-dimensional unsteady large eddy simulation to examine the effects of an outer layer stratification on the boundary layer structure. When the flow field is initialized with a linear temperature profile, a three-layer structure develops with a mixed layer near the wall separated from a uniformly stratified outer layer by a pycnocline. With the free-stream velocity fixed, the wall stress increases slightly with the imposed stratification, but the primary role of stratification is to limit the boundary layer height. Ekman transport is generally confined to the mixed layer, which leads to larger cross-stream velocities and a larger surface veering angle when the flow is stratified. The rate of turning in the mixed layer is nearly independent of stratification, so that when stratification is large and the boundary layer thickness is reduced, the rate of veering in the pycnocline becomes very large. In the pycnocline, the mean shear is larger than observed in an unstratified boundary layer, which is explained using a buoyancy length scale, u*/N(z). This length scale leads to an explicit buoyancy-related modification to the log law for the mean velocity profile. A new method for deducing the wall stress based on observed mean velocity and density profiles is proposed and shows significant improvement compared to the standard profile method. A streamwise jet is observed near the center of the pycnocline, and the shear at the top of the jet leads to local shear instabilities and enhanced mixing in that region, despite the fact that the Richardson number formed using the mean density and shear profiles is larger than unity.
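The buoyancy length scale and the modified log law mentioned above can be sketched schematically. The correction form below is a Monin-Obukhov-style assumption with an unspecified empirical constant, not the paper's exact expression:

```latex
% Classical log law for the mean velocity in the wall layer:
U(z) = \frac{u_*}{\kappa}\,\ln\!\left(\frac{z}{z_0}\right)

% Buoyancy length scale from the friction velocity and local buoyancy frequency:
\ell_N(z) = \frac{u_*}{N(z)}

% Schematic buoyancy correction to the mean shear (\alpha is an assumed
% empirical constant; the correction vanishes as N \to 0):
\frac{\partial U}{\partial z} = \frac{u_*}{\kappa z}\left(1 + \alpha\,\frac{z}{\ell_N}\right)
```

In the unstratified limit N → 0 the buoyancy length ℓ_N diverges, the correction term vanishes, and the classical log law is recovered, consistent with the modified methods reducing to the profile method when the flow is unstratified.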