We examine the use of automatic adaptation to the user’s grasp and facial direction in interaction with a game. Two experiments were conducted. The first identified patterns in grasp and facial direction that can serve as objective indicators of intention and attention. The results indicate that participants grasp a remote control according to its intended use and turn their face towards the object with which they intend to interact. The amount of time participants spent facing the object was influenced by the available visual information. In the second experiment, we used the patterns identified in Experiment 1 to create a game that adapts to grasp and facial direction. We compared two adaptive versions of the game to a purely command-based version. The results show that participants playing the adaptive versions were significantly faster and made fewer errors, but they did not rate their feeling of control higher, nor did they report a more positive affective experience.