abstract: an indication of what I am going to investigate.
acknowledgments.
contents page: with the page number of each chapter
introduction: more detailed
literature review
themed topic chapters
results
discussion or findings
conclusions
your publications
references
appendices
What is the goal of a research proposal?
The goal is to present the author’s plan for the research they intend to conduct. Most research proposals follow the same structure.
Determine how and why your research is relevant to the field by:
filling a gap in the existing body of research on your subject,
underscoring existing research on your subject, and/or
adding new, original knowledge to the academic community’s existing understanding of your subject.
The research proposal must also explain the following:
the research methodology you plan to use
the tools and procedures you will use to collect, analyse and interpret the data (surveys, questionnaires, interviews with directors or peers, reviews of academic texts)
How long should a research proposal be?
Usually a few pages long. A research proposal’s goal is to clearly outline exactly what your research will entail and accomplish, so the proposal’s word count or page count matters less than ensuring that all the necessary elements and content are present.
Structure of a research proposal
All research proposals include the following sections:
Introduction: where you introduce your topic, state the thesis statement and the questions your research aims to answer, and provide context for your research. It should be concise; in some cases you may need to include an abstract and/or a table of contents before the introduction.
Background significance: where you explain why your research is necessary and how it relates to established research in your field. Your work might complement existing research, strengthen it, or even challenge it. This is also the section where you clearly define the existing problems your research will address. By doing this, you are explaining why your work is necessary (the “so what” question). You also outline how you will conduct your research. If necessary, note which related questions and issues you won’t be covering in your research.
Literature review: where you introduce all the sources you plan to use in your research, including landmark studies and their data, books and scholarly articles. A literature review delves into the collection of sources you have chosen and explains how you are using them in your research.
Research design, methods, and schedule: including your research plan. The type of research you will do: are you conducting qualitative or quantitative research? Are you collecting original data or working with data collected by other researchers? Is this experimental, correlational or descriptive research? How are you using the collected data, what data are they, and which tools will you use to collect them? This section should also include the research timeline, the research budget and any potential obstacles you foresee, along with your plan for handling them.
Supposition and implications: you should be going into the project with a clear idea of how your work will contribute to your field, and this section expresses exactly why your research is necessary. It also covers any ways your work can challenge existing theories and assumptions in your field; how your work will create the foundation for future research; the practical value your findings will provide to practitioners, educators and other academics in your field; the problems your work can potentially help to fix; policies that could be impacted by your findings; and how your findings can be implemented in academia or other settings and how this will improve or otherwise transform those settings. This section is where you state how your findings will be valuable.
Conclusion: briefly summarises your research proposal and reinforces your research’s stated purpose.
Bibliography: a list of your sources and their authors.
How to write your research proposal?
Use a formal, objective tone. Being concise is a key component. Present your research proposal in a clear and logical way.
Editing and proofreading a research proposal is also an important step
When writing a research proposal, avoid the following:
being too wordy; be economical with your words
failing to cite relevant sources
focusing too much on minor issues; cover only the major, key issues you aim to tackle in your proposal
failing to make a strong argument for your research; a research proposal is a piece of persuasive writing, which means that although you are presenting your proposal in an objective, academic way, the goal is to get the reader to say “yes” to your work
Finally, polish your writing into a stellar proposal.
Thesis ideas planning
Considering the developments in my two project plans and the FMP ideas I had been considering, I came up with two possible areas of research for my thesis:
2D and 3D: Hybrid Animation:
The “Paperman” workflow with Meander animation and the ‘Jing Hua’ watercolour animation style.
I am very fond of 2D animation; however, I do understand that 3D animation is more advanced and full of possibilities. What I liked about the Disney short “Paperman” is how it blends traditional animation and computer animation. They used Meander animation. Meander started development in 2010 as a stand-alone vector/raster hybrid animation system with the primary goal of bringing the power of digital tools to hand-drawn animation. Although originally targeting 2D cleanup animation, it was designed to be general enough for use throughout all departments in the studio. On the other hand, ‘Jing Hua’ skilfully invokes the beauty of Chinese watercolour in animation by projecting watercolour brush strokes onto 3D models.
These two beautiful hybrids of hand-drawn and CG animation were very inspiring to me and made me reflect on adding them to the FMP short animation I intend to develop, where the main character would “enter” a Van Gogh painting (which is where I intend to use this type of animation). Regarding my thesis, I thought I could research 2D and 3D animation and the techniques developed to blend them together.
Anthropomorphism In Character Animation
The second thesis idea originated from the two projects I am currently working on for the Advanced 3D module in term 2: one project focuses on a human character animation and the second on a cat character animation. So, I figured that I could carry out research on the differences between human and animal animation in films, both from a movement point of view and from an emotion perspective. Moreover, I could use the two animations I have created as practical examples.
The difference between formative abstraction and conceptual abstraction is that the former looks only at the formal elements, while the latter starts to look at story structure or the traditional canon and to break them down; it does not rely entirely on formal elements: it may use them, but it might also start to break down film language.
Historically, experimentation in these areas does not concern mass media or the commercial industry; festivals, however, are the place where these works are showcased most of the time. Futurists, Dadaists, Surrealists and Cubists left their mark on film and on animation. From the 1930s to the 1950s, wars and social and cultural unrest took place, and concepts in films and art reflected the society of that time. Conceptual abstraction in experimental film reflects a personal vision and is usually found in independent film (though it may be commissioned). Independent production is harder to define by genre than the mainstream and by nature requires alternative strategies to appraise it.
Semiotics, metaphor, symbolism and the juxtaposition of traditional canons are the elements that characterise conceptual abstraction. These films feature dialogue, but not in a traditional sense, since it is a visual dialogue rather than a spoken one. Non-dialogue films challenge the communicator to convey information through gesture and performance, filmic language and alternative audio components.
Jan Švankmajer was a pivotal animator in stop motion and time lapse. This film is an example of a non-dialogue film, with dialogue not in a traditional sense. There are many metaphors and much symbolism alluding to the human condition, how we communicate with each other and political issues as well: three surreal depictions of failures of communication that occur at all levels of human society.
Max Hattler is a German CGI artist. This piece can be considered a homage to the early experimental artists of the industrial age and to German Expressionism.
This film is an example of how it is possible to create a visual political comment through the use of abstraction. The audio is telling the audience the story. The whole feel of the piece is very engaging; it is not a direct comment, but the association is clear: Islamic patterns and American quilts, and the colours and geometry of flags as an abstract field of reflection.
This is an experimental portrait of one of the most vertical cities in the world: Hong Kong. There is a strong sense of rhythm and pattern. This experimental animation renders the overwhelming feeling of missing horizon views within megalopolises subjected to densification, while using a photographic technique to reveal the continuity in the buildings’ facades.
Expanded cinema is used to describe a film, video, multi-media performance or immersive environment that pushes the boundaries of cinema and rejects the traditional one-way relationship between the audience and the screen. For this piece Max Hattler set up fountains that spray water and projected the film onto the cloud of water created by the fountains themselves.
This is a chain of animators working together. It is not necessarily an experimental animation; however, it shows how even very different animations can fit together starting from a basic action, in this case a red ball.
The following examples from Bálazs Simon showcase the “commercial side” of experimental animation: you can produce original work outside the traditional canon and still be employable within a studio.
Storyboards are commonly used in film, web and game design as a quick, effective way to communicate spatial position, sequence, motion and interaction to pre-visualise a scene.
Before starting the storyboard based on the draft of the script we have, I thought I could carry out some research into storyboarding in VR, since storyboards are usually used to frame a scene and I am not used to storyboarding when there is no frame at all. However, it is still possible to storyboard a scene: instead of objects being defined relative to the frame (directing audience attention), objects are defined relative to the audience.
So instead of controlling what the audience sees in VR, probabilistic areas of user attention based on ergonomic data are used (modify the work to fit the player, not the other way around).
This video by Mike Alger was very useful for better understanding VR interactions. He mentions that the field of view in a VR head-mounted display (HMD), the goggles you would wear when playing in VR, is 94 degrees. In a seated position, a person’s head can rotate 39 degrees to the side and 55 degrees up (when standing or using a wireless HMD these numbers might be higher).
When considering distance, people tend to pay more attention when objects are closer. The minimum comfortable viewing distance in an HMD, before a user starts going cross-eyed, is 0.5 meters (Oculus now recommends a minimum distance of 0.75 meters). Beyond 10 meters the sense of 3D stereoscopic depth perception diminishes rapidly, until it is almost unnoticeable beyond 20 meters. So the ideal distance at which to display content would be between 0.5 and 10 meters.
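As a quick sanity check on these numbers, below is a small Python sketch (my own, not part of the original post) that tests whether an object position, given in metres relative to the player, falls inside the comfortable zone implied by the 94-degree field of view, the 39 degrees of comfortable seated head rotation and the 0.5 to 10 metre distance range.

```python
import math

HMD_FOV_DEG = 94      # horizontal field of view of the headset
HEAD_TURN_DEG = 39    # comfortable seated head rotation to each side
MIN_DIST_M = 0.5      # closer than this and the user starts going cross-eyed
MAX_DIST_M = 10.0     # beyond this, stereoscopic depth fades quickly

def in_comfort_zone(x, z):
    """x = metres to the player's right, z = metres in front of the player."""
    distance = math.hypot(x, z)
    if not (MIN_DIST_M <= distance <= MAX_DIST_M):
        return False
    # Angle of the object from straight ahead, compared with how far the
    # combined head turn plus half the field of view can comfortably reach.
    angle = abs(math.degrees(math.atan2(x, z)))
    return angle <= HEAD_TURN_DEG + HMD_FOV_DEG / 2.0

print(in_comfort_zone(0.0, 2.0))   # straight ahead, 2 m away -> True
print(in_comfort_zone(5.0, 0.2))   # almost directly to the side -> False
```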
Combining the information above regarding field of view and distance gives the following general diagram:
Adjusting it to a more realistic scale relative to the user, and projecting it into 3D to allow for vertical annotation, the diagram would look like this one (I have added our scenario, where the player would be in front of the seagulls while being a seagull himself/herself).
This VR storyboard allows us to position objects relative to the audience in 3D space, while still being able to convey sequence, motion, and interaction.
How to storyboard transitions between environments: in VR the player is going to use the teleport tool, so the transition is going to happen through a button.
Deformation of the screen to simulate the player view
Seaside environment storyboard
Sky storyboard
Pier storyboard
Research comedy genre in games (interactive comedy)
Comedy in video games
When using this genre in games, especially interactive ones, developers should not focus on “funny games”; they should be more interested in talking about comedy in games, since humour is a very subjective topic. Comedy is a genre with form and structure and adheres to certain rules: it makes use of irony, contrast, suspense and poetic justice. The creation of comedy in games should be more objective, allowing us to create more complete and diverse comedic experiences. A challenge that might be faced when creating comedy for video games is timing, which is the key to successful delivery. But how can game developers execute a well-timed joke when the player is in control of the moment-by-moment progression of the game? The answer is to take control away from the player. Much of the comedy takes place in cut-scenes, dialogue or context-sensitive action prompts: scripted moments. The player puts down the controller and enjoys a short comedy sketch laid out for them (developers remain in control of the timing this way and the player still feels connected to the game). However, this approach disrupts the player’s interaction.
A rather innovative solution to the problem of comedic timing in an interactive medium is structuring the timing around the player and not the character, the real world and not the game world: jokes that react to the actions and feelings of the player rather than to events taking place within the game fiction (pressing the start button, for instance).
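As a purely illustrative sketch of this idea (my own example, not from the source), the gag below fires in response to a real-world player input, pausing the game, so the delivery is always timed by the player rather than by the game fiction.

```python
# A toy example of player-reactive comedic timing: the joke triggers on the
# player's own action (pausing), not on an in-game event. The quips and class
# name are invented for illustration.
class PauseComedy:
    QUIPS = [
        "Taking a break already? The seagulls will wait. Reluctantly.",
        "Paused. Somewhere, a chip goes unstolen.",
    ]

    def __init__(self):
        self.pause_count = 0

    def on_player_pause(self):
        # The trigger is the player's real-world action, so from the player's
        # point of view the delivery always lands "on time".
        quip = self.QUIPS[self.pause_count % len(self.QUIPS)]
        self.pause_count += 1
        print(quip)

game = PauseComedy()
game.on_player_pause()  # -> first quip
game.on_player_pause()  # -> second quip
```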
“How do we create interactive comedy?”. How do we build comedy into the gameplay? How do we create comedy that could only be created within an interactive medium? What does interactivity fundamentally add to the tradition of storytelling?
The big gap between films and games is that, at least from the development-phase point of view, the development team may try to predict the actions of the player and may even program responses for each possible eventuality, but it remains true that the game doesn’t “know” what you are doing. The game then acts as a “blind storyteller”, in this way creating one of the essential dynamics in comedy: the straight man.
For comedy-in-absurdity to work, there has to be an established baseline of normality that the agents of absurdity are deviating from. If absurd things are happening in an absurd world to absurd characters, they are no longer “unusual”, per the rules established by that world, and thus the comedy is lost. A straight man is required, a character acting as an audience surrogate, reacting to the absurd in the way that the audience would in that situation. It could also be the setting: an otherwise sane setting can play the foil to the absurdity of its inhabitants. The straight man is often portrayed as reasonable and serious, while the other is portrayed as funny, less educated or less intelligent, silly, or unorthodox. When the audience identifies primarily with one character, the other is often referred to as a comic foil. The blind-storyteller-as-straight-man dynamic allows for magnificent reality-breaking. The player, the only agent in this dynamic cognizant of the rules that govern reality outside of the game, can exploit the game’s artifice to produce moments of hilarity played, by the game, entirely straight-faced.
What Makes Games Funny?
Games are, by nature, comedic: even extremely serious dramas can become comedic by allowing the player to do things that go against their tone. There are therefore different scenarios for comedy to develop in interactive games:
Finding humour where the developers did not intend it – any effect the player can have on the world can let them do or create something humorous (unintentional, inherent comedy)
Humour through story and narrative – one way to include humour is via the story; however, do not make everything into a joke, since comedy is based on contrast and when everything is funny nothing stands out
Chaos and mayhem are often hilarious – comedy from chaos is about the unforeseen: it can appear unplanned and offers the possibility of abrupt, explosive failure (failure must be fair, though; sudden death can be frustrating and no fun)
Humour through gameplay – it relies on unexpected and unique interactive gameplay elements that serve to illustrate the joke (compared to spoken lines, which are non-interactive): a vital point is that you have to actively perform something and get a reaction. Another expedient could be to re-purpose regular game actions for other purposes, which makes them funny, and have characters remark upon them as the wrong actions
There are a lot of straightforward ways to make a game more comedic. The elements stated above can be used to engage the player with comedic elements, instead of just having funny things appear. While we were in the meeting we actually thought of some interactive elements for the player to have during the game, since we consider the active role of the player essential to the overall experience.
Seagull and prop models (assets)
Seagull 1 model adjustments
In our second meeting I received some feedback on the model of the seagull I had made: overall it was good; however, they suggested I create a wing that would resemble a human hand with fingers, since in the game there will be some actions involving the seagulls grabbing objects.
In order to use the wings as hands, the feathers should have at least a feather-looking thumb, but not necessarily five fingers/feathers: the bones of the wing are too specialised for flight to become a hand, but feathers could be used as fingers. This would keep the wing looking like a wing while having hand-like functions. I thought I could use the primary wing feathers as fingers. Moreover, when grasping something or gesturing, the wing would look like a humanised wing, while still looking like a bird wing when flying. I was inspired by animations where bird characters can bend their wings forward like arms, something that is physically impossible in real life.
This is the new design I have created and that I will use as a reference when modeling it.
New design:
rendered model
I was not happy with how the right paw turned out, so I duplicated the left one and merged it with the other leg to make them look the same.
I sewed the vertices together to match the new paw with the leg.
Seagull 2 model
Just like I did for the other model, I used the model sheet as a starting point to model the second seagull.
Once I had aligned the polygon to the image from both the side and front views, I separated the two legs, creating new edge loops and deleting the faces I did not need.
I worked on the details of the “feathers”/skin at the root of the leg using the vertices of the model.
I created the eye sockets by extruding and then deleting two faces at the sides of its head:
From a sphere I created the eyeball, and from another sphere I used the top faces as the upper lid and the bottom faces as the lower lid. I also assigned a white Blinn material to the eyeball and a black one to the pupil to make them reflect light as eyes would.
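For reference, here is a minimal maya.cmds sketch of this kind of eye setup (my own illustration; the object names, sizes and shading values are placeholders, not the ones used in the actual scene).

```python
import maya.cmds as cmds

def make_eye(name="seagull_eye", radius=0.5):
    # Eyeball sphere plus a smaller pupil sphere pushed to the front of the eyeball
    eyeball = cmds.polySphere(name=name + "_ball", radius=radius)[0]
    pupil = cmds.polySphere(name=name + "_pupil", radius=radius * 0.35)[0]
    cmds.move(0, 0, radius * 0.75, pupil)

    # White Blinn for the eyeball, black Blinn for the pupil
    for obj, colour in ((eyeball, (1.0, 1.0, 1.0)), (pupil, (0.0, 0.0, 0.0))):
        shader = cmds.shadingNode("blinn", asShader=True, name=obj + "_blinn")
        cmds.setAttr(shader + ".color", colour[0], colour[1], colour[2], type="double3")
        sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True, name=shader + "SG")
        cmds.connectAttr(shader + ".outColor", sg + ".surfaceShader", force=True)
        cmds.sets(obj, edit=True, forceElement=sg)

    return eyeball, pupil

make_eye()
```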
I then worked on the paws, creating some edge loops and moving them to make the paws resemble a seagull’s.
Final rendered picture of the second seagull
I then created a scene using an infinite background and added some area lights with yellow, pink and blue tints to see how the two seagulls looked together.
Props
Besides modeling the objects present in the script, I thought we might use some props I previously modeled and textured.
List of props that need to be modeled according to the first part of the script: fries and single fries, ice cream, sand shovel.
Rigging
I used this video as a basic process to start rigging the seagull model; it helped me understand better how to create joints for the rig, how to combine them together, how to bind the geometry to the joints and how to create IK handles for the wings.
Seagull 1 Rig:
I first created the joints according to the model, a bird’s bone structure and the actions useful for the animation in the game.
For the wing joints I created one side first and then mirrored it to the other side.
Basic rig: the body and wing joints parented together.
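A minimal maya.cmds sketch of this kind of joint setup (joint names and positions here are illustrative placeholders, not the real rig values):

```python
import maya.cmds as cmds

cmds.select(clear=True)
hips = cmds.joint(name="hips_jnt", position=(0, 5, 0))
spine = cmds.joint(name="spine_jnt", position=(0, 6, 0))
head = cmds.joint(name="head_jnt", position=(0, 7, 1))

# Left wing chain, parented under the spine
cmds.select(spine)
l_wing = cmds.joint(name="L_wing_jnt", position=(1, 6, 0))
l_wing_tip = cmds.joint(name="L_wingTip_jnt", position=(3, 6, 0))

# Mirror the left wing chain across the YZ plane to create the right wing,
# renaming the "L_" prefix to "R_"
cmds.mirrorJoint(l_wing, mirrorYZ=True, mirrorBehavior=True,
                 searchReplace=("L_", "R_"))
```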
I then bound the geometry to the rig so that the model moves along with it, using the Bind Skin tool.
I used the IK Handle tool to create IK movements for the wings. In animation it is a tool for posing joint chains: an IK handle resembles a wire running through a joint chain and enables you to pose the entire chain in one action. As you pose and animate the joint chain, its IK handle automatically figures out how to rotate all the joints in the chain using an IK solver.
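Sketched in maya.cmds, with the same placeholder names as above, smooth binding the mesh and adding a wing IK handle might look like this:

```python
import maya.cmds as cmds

# Smooth-bind the seagull geometry to the skeleton under the root joint
# ("seagull_mesh" and the joint names are illustrative placeholders)
cmds.skinCluster("hips_jnt", "seagull_mesh",
                 toSelectedBones=False, maximumInfluences=4)

# Rotate-plane IK handle running from the wing root to the wing tip
ik_handle, effector = cmds.ikHandle(startJoint="L_wing_jnt",
                                    endEffector="L_wingTip_jnt",
                                    solver="ikRPsolver",
                                    name="L_wing_ikHandle")
```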
I froze the transformations on the controls to help at the animation stage, so that when moving the controls there is a clear, zeroed starting point.
Once I started exploring the model’s range of movement, I noticed that the rig I had previously created for the hands/feathers was not working at all, so I had to start again, creating more joints this time to make the fingers curl.
Leg problem rig IK handle to bend:
Skin weights for the leg, where I painted the area that I wanted to be affected by the joint movement.
I added some controls for the finger joints: since a joint’s transformations cannot be frozen, I created some NURBS circles to better identify the joints and, above all, to animate them later; after parenting them to the joints I was able to freeze their transformations, so that when animating, their values can easily be returned to zero.
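For illustration, this is a common constraint-based way to build such NURBS-circle controls in maya.cmds; note that the post describes parenting the circles directly to the joints, so treat this variant, and its names, as an assumption rather than the exact setup used.

```python
import maya.cmds as cmds

def add_control(joint, radius=0.3):
    # Build a NURBS circle named after the joint (illustrative naming convention)
    ctrl = cmds.circle(name=joint.replace("_jnt", "_ctrl"), radius=radius,
                       normal=(1, 0, 0), constructionHistory=False)[0]
    # Snap the control to the joint's position and orientation
    cmds.delete(cmds.parentConstraint(joint, ctrl))
    # Freeze so the control reads 0, 0, 0 at its rest pose
    cmds.makeIdentity(ctrl, apply=True, translate=True, rotate=True, scale=True)
    # Drive the joint with the control
    cmds.parentConstraint(ctrl, joint, maintainOffset=True)
    return ctrl

add_control("L_finger01_jnt")
```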
I adjusted the rig by unbinding it first from the mesh and working only on the rig.
I added an IK handle for the knees and some customised controls as well.
I separated the eyes from the main mesh so that it would be easier to control their rotation. As for the eyelids, I created two joints to be attached to them, allowing me to move them to animate an eye blink: one for the upper lid and one for the lower lid.
I adjusted the hierarchy of the rig by putting everything under the hips joint: before, the head joint was at the top; however, the centre of balance of a body should be the root of a character, so I made sure that every joint of the rig sat under the hips joint. I downloaded a simple rig to better understand the hierarchy.
I also created the controls for the eyes and the eye lids
The arrows outside are for pulling down the upper lid and pulling up the lower lid; there is also a locator to move the pupil.
Moreover, I added an attribute to the fingers/feathers using the Set Driven Key tool to make them curl when it is set to 1.
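A minimal maya.cmds sketch of a Set Driven Key setup like this (the control name, joint names and rotation values are invented for illustration):

```python
import maya.cmds as cmds

ctrl = "L_wing_ctrl"
finger_joints = ["L_finger01_jnt", "L_finger02_jnt"]

# Add a 0-1 "curl" attribute to the control if it does not exist yet
if not cmds.attributeQuery("curl", node=ctrl, exists=True):
    cmds.addAttr(ctrl, longName="curl", attributeType="double",
                 minValue=0, maxValue=1, defaultValue=0, keyable=True)

for jnt in finger_joints:
    # curl = 0  ->  finger straight
    cmds.setAttr(ctrl + ".curl", 0)
    cmds.setAttr(jnt + ".rotateZ", 0)
    cmds.setDrivenKeyframe(jnt + ".rotateZ", currentDriver=ctrl + ".curl")

    # curl = 1  ->  finger fully curled
    cmds.setAttr(ctrl + ".curl", 1)
    cmds.setAttr(jnt + ".rotateZ", -60)
    cmds.setDrivenKeyframe(jnt + ".rotateZ", currentDriver=ctrl + ".curl")

cmds.setAttr(ctrl + ".curl", 0)  # reset to the rest pose
```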
This week I had an induction on how to use the Oculus VR equipment. We thought that, since it may help to visualise the objects in the scene, learning how to use some VR apps might be useful. In particular we tried Tilt Brush, a room-scale 3D-painting virtual-reality application, and explored its potential. We figured that creating a storyboard in VR, alongside a more “traditional” one, could help us plan the project better as well.
Abstract art has been a concept that many artists have explored since they started to question everything about space, colour or form, and as soon as the potential to manipulate images was available, artists transformed their ideas into movements, often using this experimentation to go against the most traditional art forms. Artists of the early 1900s avant-garde movements worked with line, form, movement and rhythm as well as colour and light.
Animation is a technical medium and experimental works such as these remain central to the development of Animation itself since they are fuelled by technological advancements which will always happen and continue to motivate independent and ground-breaking work.
There are different aspects to explore when talking about and identifying experimental films, such as the approaches used and the concepts and models employed, so it might be hard to categorise and define them: we need to understand the historical context in which the experimental film was developed, as well as determine the individual motivations and priorities of the artists (if it is commercial, it is not necessarily experimental).
An experimental film de-constructs the traditional canons of film, so the term abstraction becomes very important. Abstraction does not aim to depict an object but is composed with the focus on internal structure and form; it is usually emotionally detached or distanced from something, and it does not relate to concrete objects but expresses something that can only be appreciated intellectually. There are a number of abstractions to explore; for example, there is formative abstraction. A formative abstraction considers the formal aspects of film and image and manipulates its fundamentals such as line, colour, light, form, space, texture, sound, dynamics and movement. It usually combines two or all of those aspects. The artist’s involvement is essentially investigative and may not have a predetermined outcome, but it must be grounded in the intellectual pursuit of applying a theory or initial objective.
There are some elements that may help when analysing and implementing Formal Experimental Animation:
Categorisation– genre and sub-genre, what is the work background, settings, mood, theme or topic, how does it comment? Does it fit or is it unique?
Form and Function- interpreting its meaning and relating it to the format or presentational mode, such as “what are the artist’s objectives, limitations…”
Process– the techniques, materials and technologies applied within the work and the relationships between message and medium, (Does process, technique or tool become the message?)
Formal Elements– use of space, composition, light & colour, movement, rhythm, timing, pacing, transition and audio relationships (does the work investigate these or other formal elements?)
Is there a film that you know of that classifies as experimental animation?
Following the screenings consider an animated work you feel represents Formative Abstraction that meets the above criteria and provide a short explanation of how this is evidenced in the work.
Author/Artist Paul Jeffrey Sharits was a visual artist, best known for his work in experimental, or avant-garde filmmaking, particularly what became known as the structural film movement, an experimental film movement prominent in the United States in the 1960s.
“Dots 1 & 2”
Categorisation– An experimental film featuring a hypnotic illusion using two black-and-white sets of dots.
Form and Function- “Dots 1 & 2” relies entirely on a single gimmick used to create a fairly basic exercise, with flicker effects that dabble in total abstraction. The imagery in this approximately forty-second exercise indeed relies for its entire illusion on optical effects, yet in a compelling and original way it seems to be intended as something more hypnotic.
Process and Formal Elements– “Dots 1 & 2” is titled as such because the short features two different sets of dots: white dots on a black background, and black dots on a white background. Both sets are combined in one of those complicated optical illusions (it is real since it was made on filmstock) as they merge and grow bigger, overlapping in a never-ending loop.
Some examples of Formative Abstraction
Norman McLaren explains how he makes synthetic sound on film. With an oscilloscope he first demonstrates what familiar sounds look like on the screen; next, how sound shapes up on a film’s sound track; and then what synthetic sounds sound like when drawn directly on film.
Norman McLaren worked primarily with sound and image and worked directly on film. In this specific film he used bipacking. In cinematography, bipacking, or a bipack, is the process of loading two reels of film into a camera so that they both pass through the camera gate together. Boogie Doodle is an artistic collaboration between renowned pianist Albert Ammons and animator Norman McLaren: they made the image move according to the sound, trying to express what the sound looks like. There is no predefined meaning; anyone can apply their own sense to it.
Hans Richter was a painter; in this experimental animation he is looking at space as the surface of a canvas, exploring the connection between screen and depth and how you can create illusions through the depth of the screen. A pioneering Dada work, Filmstudie was an early attempt to combine Dadaist aesthetics and abstraction. Made in 1926, Richter’s film presents the viewer with a disorientating collage of uncanny false eyeballs, distorted faces and abstract forms (none of these themes is treated constantly). It is similar to Man Ray’s work in its ballet of motion, which combines a playful tension between figurative and abstract forms, both in negative and positive exposure. Filmstudie is essentially a transitional work of mixed styles. A number of devices drawing attention to the technical specificity of photography (multiple exposures and negative images) are also included and enter into a successful fusion with the remaining elements.
This experimental film is composed from squares, rectangles and other straight-edged forms animated in overlapping, kinetic compositions. The shapes in this film are not solid colors, but graduated tones, and the development of each sequence is built around asymmetrical compositions that break the frame into harmonious sections. The result is dynamic, active: the moving shapes suggest the rapid movement of machinery, pistons. Then in the middle of the film there is a shift towards a bifurcation of the frame and oscillating patterns that rotate around this central axis, before a return to the asymmetry of the machine-like motions.
This experimental film explored sound in image. An Optical Poem, an abstract piece of stop-motion history, was made in 1938 by German-born Oskar Fischinger, an avant-garde animator, filmmaker and painter; the music it follows is Franz Liszt’s Hungarian Rhapsody No. 2. Oskar Fischinger’s work is all about dancing geometric shapes and abstract forms spinning around a flat, featureless background. Circles pop, sway and dart across the screen, all in time to Franz Liszt’s 2nd Hungarian Rhapsody. This was, of course, well before the days of digital. While it might be relatively simple to manipulate a shape in a computer, Fischinger’s technique was decidedly more low-tech. Using bits of paper and fishing line, he individually photographed each frame, somehow doing it all in sync with Liszt’s composition.
Len Lye was fundamental to the way we look at animation now. In Kaleidoscope, Lye animated stencilled cigarette shapes and is said to have experimented with cutting out some of the shapes so that the light of the projector hit the screen directly. He developed a number of stencils, such as a yin-yang, a diamond shape, a wheel and a star, to complement his hand-painted images. The way these shapes spun and rolled across the screen anticipated the movements of his later kinetic sculptures. Inspired by the primitive imagery of South Sea island art and film’s power to present dance ritual and music, Lye’s experimental, and often revolutionary, camera-less techniques attracted the attention of John Grierson and Alberto Cavalcanti of the General Post Office Film Unit in London, which sponsored Colour Box and other films. This advert can be considered an example of how experimental animation can also be commercial. He used three techniques: stencilling (putting things on the film and then painting over them using sprays and inks on those stencils), painting or inking directly onto the film, and scratching and scraping (putting ink on the film and then scraping it).
A progression of this work, where he developed these techniques with live action, is this experimental film sponsored by Imperial Airways:
In this film he shot live action footage and then stencilled on top of it; he was commissioned by the GPO (General Post Office) Film Unit. The GPO Film Unit was a subdivision of the UK General Post Office. The unit was established in 1933, taking on the responsibilities of the Empire Marketing Board Film Unit. Headed by John Grierson, it was set up to produce sponsored documentary films mainly related to the activities of the GPO.
Here is another ‘drawn on film’ abstract animation British film short. “Trade Tattoo” is a promotional short made by Len Lye in 1937 for the GPO. (‘General Post Office’). The film utilises live action footage, composited so that it blends in and out of Lye’s abstract animation.
Len Lye was a very important practitioner in experimental film; he was at the head of the industry at that time.
Stan Brakhage was one of the classic experimental animators, since he was never sponsored by anyone and never attempted to please an audience. He produced a series of films which explored elements of nature on the film itself: he would place plants on a strip of film, put another strip on top of them, and then print everything together.
A “found foliage” film composed of insects, leaves, and other detritus sandwiched between two strips of perforated tape.
He never used sound in his films because it imposes a narrative on the image: by taking the sound away you can bring the formal aspects of the images straight to the viewer. However, he did incorporate music in this film:
In this case the music is also experimental; it is very difficult to watch, full of discord and disharmony, and this is the very intention of the artist.
John Hales Whitney, Sr. was an American animator, composer and inventor, widely considered to be one of the fathers of computer animation. He studied painting and travelled in England before World War II. James completed seven short films over four decades and collaborated with his brother John on some of his film work. James Whitney’s LAPIS (1966) is a classic work of abstract cinema, a 10-minute animation that took three years to create using primitive computer equipment. In this piece smaller circles oscillate in and out in an array of colours resembling a kaleidoscope, accompanied by Indian sitar music. He basically used an analog computer developed by his brother John, based on a gear-driven WWII surplus ballistics computer, to move many layers of hand-painted cels, frame by frame.
Every one of these pieces is sensory; it is not about a narrative or characters. When watching an experimental film we as the audience have to engage, figuring out what it is about and bringing our own interpretation into it, making us aware of something. However, there must be a good reason behind the artist’s idea for an experimental film: intellectual reasons for doing it, and references too.
Sound and image are very closely related and are major areas for experimentation, allowing us to discuss what role sound has and its impact on the image.
Unity is a cross-platform game engine. You would not normally model within this environment; you can animate in it, but Maya is more specialised for that. This workshop was useful for understanding asset management, which will also be useful for the Collaborative project, since we are going to take models and animations from Maya and export them into Unity for VR.
I first created a new project
Game tab
Play Mode is one of Unity’s core features. It allows you to run your project directly inside the Editor, via the Play button in the Toolbar.
only one camera in the scene (turning off the main camera)
How to import animated assets into Unity from Maya:
Export the selected model as an FBX file, which can contain data such as animations, 3D meshes and materials (a minimal export sketch follows these steps).
Drag and drop the file into the Assets folder: in the Inspector tab the animation, if there is any, can be played.
To play it in Game mode, drag and drop the animation clip onto the object in the Hierarchy tab.
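As an aside, the Maya-side export in the first step can also be scripted; here is a minimal maya.cmds sketch (the output path is an illustrative placeholder):

```python
import maya.cmds as cmds

# Make sure the FBX plug-in is loaded
cmds.loadPlugin("fbxmaya", quiet=True)

# Export only the currently selected model (and its animation) to FBX
cmds.file("C:/exports/seagull_walk.fbx",
          force=True,
          options="v=0;",
          type="FBX export",
          preserveReferences=True,
          exportSelected=True)
```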
Model properties in Unity. Animator tab to edit animations. Animation tab to play the animation on a timeline.
When double-clicking the animation in the Animator tab, the window on the left appears, where you can loop the animation after clicking Apply (loop animations are useful in games for continuous actions repeated in the background); when clicking the animation in the Animator tab just once, the window on the right appears, where you can speed up, slow down or reverse your animation.
Create an empty game object and drag and drop the imported model under it: this way you can scale the model by scaling the parent game object, without affecting the animation properties.
When you have multiple animation clips for the same model, dragging and dropping them onto the same object in the Hierarchy lets them be played in a sequence.
In Unity, complex models and animations can also be added with the same process.
An important aspect to take into consideration is that film and animation follow the director’s vision, so assets are framed and audiences are led to watch them through that frame (objects are defined relative to the frame, directing audience attention). In games, however, objects are relative to the audience.
This week we looked into combining the MoCap data we collected in week 1 with Human IK rigs on Maya.
In a new Maya scene I imported the first take of the MoCap data, which brings in the skeleton with the movements and actions of the animation recorded in the studio with the mocap suit the person wore.
In the Time Editor I imported the motion capture animation as a clip by selecting the hips of the rig. With the Time Editor you can easily import different motion capture animation clips and combine them together.
Before adding the second clip I had to identify the skeleton in Maya as a HumanIK character in order to attach the rig to it.
I created a character definition, where I mapped each joint of the skeleton in the viewport to the corresponding slot of the character definition.
Once the pairing was done and each joint turned green, I added a rig. For it to be bound to the skeleton, you need to set the rig we matched and identified before as the source for the character, so that it follows the animation.
I then went back to the Time Editor to further edit the animation: you can trim a clip and make it play faster or slower, just like in a video editor.
You can even add a second animation clip to merge them together as long as they are applied to the same MoCap skeleton we have imported.
In order to match the two animations, you can align the clips using the “matching options” in the Relocator menu. Selecting the initial mocap skeleton, I matched the location of the right foot at the last moment of the first clip with the location of the right foot at the first moment of the second clip.
This is the outcome:
By grabbing one of the clips and dragging it on top of the other, they are blended seamlessly, making the transition smoother.
Afterwards I created a new character definition with the same character but with no controls, doing the same thing I did with the mocap skeleton but applying it to a new rig.
The character definition works better when the character is in a T-pose, so I aligned the arms of the character.
I then matched the character rig just like I did for the mocap rig. I also created a control rig for the IK; by adding the Character 1 rig as a source, the mocap animation is applied to this new rig.
I then baked the motion capture onto the character’s rig so that I was no longer bound to the Character 1 skeleton, and I was then able to edit the mocap animation in a separate layer to adjust it. Baking is a term used widely in the 3D community and can be applied to many different processes. What it generally means is freezing and recording the result of a computer process. It is used in everything from animation to simulation to texturing 3D models and much more. Baking a simulation allows you to generate a single animation curve for an object whose actions are being provided by a simulation rather than by keys and animation curves (keysets).
The controls from the baked animation now sit in a new animation layer, ready to be edited from a clean start.
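For reference, baking animation onto a set of controls over the playback range can be scripted with maya.cmds roughly like this (the control naming pattern and the options are assumptions, not the exact settings used):

```python
import maya.cmds as cmds

# Gather the control transforms (assumes an illustrative "*_ctrl" naming convention)
controls = cmds.ls("*_ctrl", type="transform")
start = cmds.playbackOptions(query=True, minTime=True)
end = cmds.playbackOptions(query=True, maxTime=True)

# Bake keys onto the controls frame by frame over the playback range
cmds.bakeResults(controls,
                 time=(start, end),
                 simulation=True,          # evaluate every frame while baking
                 sampleBy=1,
                 preserveOutsideKeys=True)
```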
On the other hand, for a more complicated rig the process of adding the mocap animation onto it is different (adding finger animation, for instance, or facial expressions). In particular, this rig is not identified in Maya as HumanIK. As with the previous rig, I first identified it by positioning it in a T-pose, aligning the arms.
I then created a new character definition, selecting the rig and identifying the skeleton.
I then created a custom rig mapping, selecting the controls and afterwards applying the animation onto it as well; this mapping allows your character to source animation streamed from MotionBuilder or from a local HumanIK character within Maya.
During this weekly meeting we sorted out many aspects of our project:
The animation dynamics: we are going to have specific actions for the seagulls; however, when they are not doing any specific action they will still be making some sort of idle movement (just like characters in games do).
The script and the game should be 5 minutes long
Players would play the game as if they were birds: with wings instead of hands and paws instead of legs; these are not going to be animated since, to move in the scene, the player will use a “teleport” that brings him or her directly to the next environment.
How the models’ scaling and animation in Maya adapt to the Unity environment: for animations we are going to have single files where the seagulls do simple actions (e.g. jump, side to side) and then we are going to import them individually into Unity; regarding the scaling, it won’t be a problem if the models are too big, since in Unity it is easy to adjust them to the right size.
We finalised the first version of the script and commented on it all together so we could start the storyboard.
I showed the model of the first seagull that I have created and they suggested some changes I could apply, such as making the wings look more like a human hand with feathers functioning as fingers, since the seagulls are going to grab objects in the story.
Tilt Brush: Tilt Brush is a room-scale 3D-painting virtual-reality application which we thought we might use to create our storyboards directly in the VR space. I also tried it out, since one of the team members brought along the VR equipment, to get an idea of how it works.
We also established the next tasks to work on until next meeting:
I am going to focus on the storyboard based on the script, and do research into the comedy genre in games according to the project brief, since it might be useful for the overall tone of the VR game. I am also going to make the adjustments to and finalise the first seagull model and create the second seagull model, as well as some prop models, also according to the script. After the seagull models are done I will try to rig them.
Since we started working on the script for some sketch ideas for the seagulls’ interactions, I thought it might be a good idea to create some designs of the settings where these comedy sketches take place, in order to make it easier to model the environment props and scenes first, which may help the previs stage. The first design is based on a dialogue the two seagulls have in the rear part of a restaurant, where they have a dinner just like in the Disney movie “Lady and the Tramp”: we thought that it might be a funny situation and it would fit perfectly with the comedy genre this project is an adaptation of. The design inspired by Brighton’s pier is also an idea for a setting; we thought that there could be an interactive moment with the person playing in the virtual reality environment where one of the seagulls steals an ice cream from him/her and the second seagull apologises for him, saying that when he is hungry he becomes cranky (Brighton’s seaside is a well-known spot where seagulls tend to steal food from tourists and locals).
Story Script
Idea Generation for VR Collaborative
Environment changes to city
*At the bin outside an Italian restaurant, there is a table filled with oddities and pieces of trash such as pizza crusts and crushed coke cans*
*The two seagulls get to the rear part of the restaurant, where they find a set Italian dining table*
Seagull two “Don’t say I never think about you”
Seagull one “How did you set up the table?”
Seagull two “A couple of raccoons owed me a favour”
*Lady and the tramp situation-like where they end up eating the same spaghetti noodle and there is an embarrassing moment*
Seagulls mistaking modern art for trash.
Seagull one “I’ve never seen garbage set up so well”
Seagull two “But this is the latest installation by one of the most prominent artists of the moment”
Seagull one “It still looks like garbage to me”
Seagulls aiming at humans with a point system
Seagull one “Last year I almost got second place in the Seagull Shooter Championship. I would have come first, but a sudden gust of wind saved my last victim”
Seagull two “Don’t be discouraged, this year the wind will be calmer”
These sketches are some ideas we could use to set our narrative and create funny moments for the player to witness. They are based on the comedy created by Joshua Barkman in his comics: we took clichés about seagulls (seagulls stealing food, seagulls eating garbage, seagulls pooping on people) and turned them into comedy sketches. As you can see from the script above, we first describe the context in which the sketch takes place and then comes the dialogue, which will later be the speech bubbles displayed in the VR environment.
Animation Testing with reference videos
For the animation, since it will not be displayed as a normal 3D animation (no video with a start and an end) but in a virtual environment, and there will be no voice-over nor lip-sync, the way we plan it varies from what we are normally used to, and we should adapt the animation to our needs. In our first meeting we decided how the sketches are going to be represented: we are going to use speech bubbles, just like in comics, so that it is immediately clear that we are referring to a comedy sketch. The seagulls will move similarly to how they move in real life, in a “jerky” way.
With a rig that we found online I tried to recreate some movements that the seagulls themselves would do in VR; in this case I did a walk cycle followed by the seagull looking directly at the camera, which would be the player, from very close.
Characters Models
In order to animate the seagull model in Maya, I created a character model sheet to use as a reference for the character. Here follow a few screenshots from the modeling process in Maya:
I first used a simple polygon and tried to make it match the model sheet I had previously drawn, which depicted seagull 1 in profile view and in front view with its wings spread out.
After having created the main body using the side view of the character, I then used the front-view reference image and adjusted the model to it, adding the legs too; I also started modeling the wing using another polygon shape, again following the reference image.
I also extruded the faces at the end of the legs to create the paws, using the study I carried out on seagulls: they have webbed feet with three toes pointing out at the edges.
Using a diamond shape, I used the vertices to create the beak of the seagull: I separated the top from the bottom in order to be able to animate the mouth opening and closing later.
I then created the eye sockets to position two spheres (one small and one bigger) to use as the seagull’s eyes: I assigned them a Blinn material and then selected a few faces of the sphere and made them darker to simulate the pupils.
I finally assigned a colour to the different body parts and rendered a picture in Maya using a directional light to show to my team members.
The Camera Sequencer gives you tools to lay out and manage camera shots, then produce rendered movie footage of the animation in your scene. You can start to lay out shots in Maya using different camera angles and movements, and in the Camera Sequencer editor change the order of the shots and even export them together using an Ubercam, which functions as one camera incorporating all the cameras you have worked with so far. The editor resembles the one in After Effects or any video editing software from the Adobe suite. Even for large scenes, you can produce movie clips that achieve real-time playback.
In the Camera Sequencer editor you can create the shots, assign each one a camera and decide which frames to include in the shot too.
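Shot creation can also be scripted; here is a minimal maya.cmds sketch (the camera names and frame ranges are invented for illustration):

```python
import maya.cmds as cmds

cam1 = cmds.camera(name="wideShotCam")[0]
cam2 = cmds.camera(name="closeUpCam")[0]

# Shot 1: frames 1-60 through the wide camera
cmds.shot("shot1", currentCamera=cam1,
          startTime=1, endTime=60,
          sequenceStartTime=1, sequenceEndTime=60)

# Shot 2: frames 61-120 through the close-up camera
cmds.shot("shot2", currentCamera=cam2,
          startTime=61, endTime=120,
          sequenceStartTime=61, sequenceEndTime=120)
```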
After experimenting with the scene provided to us by the teacher, I came up with a very simple idea of representing a crowd marching in the mountains and created some cinematic shots to explore the tool in Maya.
The march
Overall I think this is a very interesting tool that might come in handy when producing the previs for our project directly in Maya, especially for visualising the transitions between one environment and the next within the narration in the virtual reality space.