Embodied conversational agents (ECAs) provide one possible way of incorporating the nonverbal portion of speech into voice-based user interfaces. Part of an agent's visual behavior and appearance is often handled "statically." Appearance can also be changed dynamically (for example, through head or eye movements, mouth opening, etc.). Our hypothesis is that some "dynamic" parameters are more important to the user than others. In this paper we present the pilot user study...