Animating characters is an important topic in computer graphics and is also relevant to robotics. The high degrees of freedom and the complex interaction between a mesh and its bones make generating even short animation cycles a complex and tedious task, especially when done by hand. While motion capture is widely used for humanoid characters, arbitrary objects still require extensive manual work by animators. In this thesis, a pipeline is proposed to automate the animation of arbitrary hand-drawn sketches. Parts of a sketch are classified based on their properties, for example as wings or legs, and these classifications are combined with a general task descriptor, for example walking, to generate believable locomotion. This provides a straightforward way to create a lively environment from one or multiple sketches. Beyond artistic applications, such characters can also be used in an educational setting as personalized guides for data exploration. The results will be evaluated in a user study assessing the believability of the generated locomotion.