Wednesday 12 October 2016

Understand theory and applications of 3D - Unit 66 & 67

Understand theory and applications of 3D
Start Date: 27/09/16
Tutor: James Tedder
Student: Wyatt Chapman
Unit 66 & 67
--------------------------------------------------------------------------------------------------------------------------

3D Modelling.
The process of creating anything from a biological being to a simple stool all starts with one shape, one object. From that object you create another, then weld, cut, duplicate, scale, rotate and more to create an entirely new entity. As you add more and more, the simple shape takes form, becoming the image you have in mind. 3D modelling allows a huge amount of creativity, and programs like Maya even let you animate your object once you're satisfied with the results.

Many media types use 3D modelling: TV, games, films, product design, websites, animation, and architectural design. Each has a different purpose and its own way of portraying its models.

3D Modelling Software:

- Autodesk Maya - Maya has the most variety. It's used in films, TV, games, and architecture.

The software has been used in many cinematic movies, such as Monsters, Inc., The Matrix, Spider-Man, The Girl with the Dragon Tattoo, Avatar, Finding Nemo, Up, Hugo, Rango, and Frozen.

Students and teachers can get a free educational copy of Maya from the Autodesk Education Community. The versions in the community are licensed for non-commercial use only. Users get a full 36-month licence, and when it expires they can return to the community to request a new one.

- AutoCAD - AutoCAD was first released in December 1982 for PC and later in 2010 for mobile. It's a commercial software application for 2D and 3D computer-aided design and drafting.

- 3D-Coat - 3D-Coat is a commercial digital sculpting program created by Pilgway. The software was developed to create free-form organic and hard-surfaced 3D models from scratch. It comes with tools that let users sculpt, add polygons, create UV maps and, once complete, texture the resulting models with a natural painting tool and render static images.

- Houdini - Houdini is a 3D animation software developed by Side Effects Software in Toronto. Side Effects made Houdini from the PRISMS suite of procedural generation software tools.

Houdini has been used in a few animation productions, including films like Frozen, Zootopia, Rio, and The Ant Bully.

For those who don't want to pay for the software, Side Effects Software publishes a limited version called Houdini Apprentice, which is free for non-commercial use.

Gaming/Polygons - Almost everything to do with gaming these days uses some form of 3D modelling software; many games use the programs above, the most common being Maya. Although dedicated modelling software exists, game engines also have their own 3D modelling tool kits that come with the software. Unreal Engine 4, for example, has its very own 3D modelling kit and animation set.


Making models for games is extremely time-consuming and tedious, and requires real focus. There is also the polygon budget to think about. In a game you want the fewest polygons you can possibly get away with, because the engine is rendering thousands of assets in real time and every polygon takes processing power. It isn't really a fixed budget, more a limitation you model within.

Every model is made of subdivisions placed on the object, and you can add or subtract subdivisions to raise or lower the polycount. Each one is editable: you choose whether to select a "face" (a section of a subdivision), an edge, a vertex and so on. Adding more subdivisions means a higher toll on your computer, because it has more geometry to process.
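As a rough illustration, a mesh can be thought of as a list of vertex positions plus faces that index into that list, and each subdivision level multiplies the face count (the vertex coordinates and growth factor below are a simplified sketch, not any particular program's format):

```python
# A minimal sketch of a mesh: vertex positions plus faces stored
# as tuples of vertex indices. This single quad is made up for
# illustration.
vertices = [
    (0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),
]
faces = [(0, 1, 2, 3)]  # one quad face, listed by vertex index

def subdivided_quad_count(levels, start=1):
    """Each subdivision splits every quad into 4, so the face count
    (and the processing cost) grows fourfold per level."""
    return start * 4 ** levels

for level in range(4):
    print(level, subdivided_quad_count(level))
```

Running this shows why polycount climbs so fast: 1 face becomes 4, then 16, then 64 after just three subdivision levels.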


As you can see in the image above, this character has more than 100 subdivisions/polygons, and each one has been edited near-perfectly to create the model. Creating just the head alone would have taken a good couple of hours. The more polygons you have, the longer the system takes to render them.

It's important to use enough polygons that the player can identify what the object is, but not so many that you slow down rendering. The best way to judge this is to look at the object's silhouette and work out what details are needed to distinguish it from other objects. The interior detail can come afterwards, since 2D textures can make up for the limited polygons. Of course some interior details need to be 3D, but there shouldn't be enough of them to impact rendering time.

Developers should almost always remove unseen polygons, since most of the time the player won't see the inside of an object, making it a waste of processing power.

It's best to build your models from triangle or quad polygons. When you import a model into a game engine, or export it from the 3D software, the polygons are triangulated so the renderer can do its calculations more easily.

By keeping your polygons as quads or tri (triangles) you minimise the chance of the system making your model more complex than needed.

Having a quad triangulate isn't a large issue, since it can only triangulate in a very limited number of ways.


Once a polygon has more than four sides, things get complicated, as we can see from the pentagon below: it has far more ways to triangulate.


Every vertex on the polygon is a completely separate point. Since the quad only had four simple edges, there were only two ways it could be triangulated, but a pentagon has five edges placed in unique locations. This gives it five possible triangulations, which means more work for the engine when it converts the model.
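The number of ways to triangulate a convex polygon grows quickly with the side count; it's the Catalan number C(n−2), which gives 2 for a quad, 5 for a pentagon, and 14 for a hexagon. A quick sketch:

```python
from math import comb

def triangulation_count(sides):
    """Number of ways to triangulate a convex polygon with the
    given number of sides: the Catalan number C(sides - 2)."""
    n = sides - 2
    return comb(2 * n, n) // (n + 1)

# A triangle is already a triangle, a quad splits two ways, a
# pentagon five ways -- and the count explodes from there.
for sides in range(3, 8):
    print(sides, triangulation_count(sides))
```

This is why sticking to tris and quads keeps the exporter's choices (and your model's final topology) predictable.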


Lastly, LODs (Levels of Detail) for models. LOD models are great for rendering because when the model is seen from far away, the developer can reduce the detail used for it, decreasing its processing cost.

LOD models are created by giving the object multiple models of different complexities depending on how far away the model is from the camera.

When creating a LOD model you shouldn't aim for high quality. Instead, go for the lowest quality possible while the object still resembles its close-up version. The model is only shown at low quality from afar, so you can have more models in one level without making the game choppy.
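Distance-based LOD selection can be sketched like this; the model names and distance thresholds below are invented for illustration:

```python
# Each entry pairs a maximum camera distance with the model to use
# at that range. Names and thresholds are made up for this sketch.
LOD_LEVELS = [
    (10.0, "hero_10000_tris"),       # up close: full-detail mesh
    (40.0, "mid_2500_tris"),         # mid range: reduced mesh
    (float("inf"), "far_400_tris"),  # far away: silhouette only
]

def pick_lod(distance_to_camera):
    """Return the first LOD whose range covers the given distance."""
    for max_distance, model in LOD_LEVELS:
        if distance_to_camera <= max_distance:
            return model

print(pick_lod(5.0))    # hero_10000_tris
print(pick_lod(25.0))   # mid_2500_tris
print(pick_lod(500.0))  # far_400_tris
```

Game engines do essentially this every frame for every object, which is why a far-away building can be a few hundred triangles instead of thousands.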

Film - Every film that uses 3D modelling/CGI pre-renders its models. By pre-rendering the model/scene you can give a model an almost unlimited number of polygons, which means the level of detail the designers can go into is unbelievable. The downside is that it increases rendering time dramatically: it can take hours to render just one frame.


In "Monsters University" it took up to 29 hours to render a single frame. That amount of detail and quality could never run in real time; you would wait minutes, or even hours, just to move a character's leg three inches.

From the image above we can see that everything was made in a 3D modelling program; every little detail, from the paper on the clipboard to the screws on the door. And this is just one part sectioned off from the rest of the scene, which is also made entirely from 3D models. Could you imagine trying to render all of those polygons in real time?

Animation - Before the 3D model can be animated it needs to have joints and control handles so the animators can pose the model. This is also known as "rigging".


Rigging is when the riggers put a digital skeleton into a 3D model. Just as you would expect, the skeleton is made up of joints and "bones". The animators use this skeleton to pose the 3D model. Simple poses can take hours, and creating animations on a par with high-quality studios can take days or weeks.

When setting up the skeleton there will be a root joint, with every other joint connected to it either directly or indirectly. This is called the "joint hierarchy".

There are two different rigging techniques:

Forward Kinematics (FK) - "one of two basic ways to calculate the joint movement of a fully rigged character. When using FK rigging, any given joint can only affect parts of the skeleton that fall below it on the joint hierarchy."

Inverse Kinematics (IK) - "the reverse process from forward kinematics, and is often used as an efficient solution for rigging a character's arms and legs. With an IK rig, the terminating joint is directly placed by the animator, while the joints above it on the hierarchy are automatically interpolated by the software."

Rigging Info Source
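The FK idea can be sketched in a few lines: each joint adds its own rotation to its parent's and offsets by its bone length, so rotating a joint moves everything below it in the hierarchy. This is a toy 2D version with made-up bone lengths and angles, not any rigging package's actual API:

```python
import math

def forward_kinematics(bones):
    """bones: list of (length, angle_in_radians) pairs, root first.
    Each joint's rotation is relative to its parent, so rotations
    accumulate as we walk down the hierarchy. Returns each joint's
    world-space (x, y) position."""
    x = y = 0.0
    angle = 0.0
    positions = []
    for length, relative_angle in bones:
        angle += relative_angle          # child inherits parent rotation
        x += length * math.cos(angle)    # offset along the bone
        y += length * math.sin(angle)
        positions.append((x, y))
    return positions

# A two-bone "arm": bending the shoulder 90 degrees carries the
# elbow and hand with it.
arm = [(2.0, math.pi / 2), (1.0, 0.0)]
print(forward_kinematics(arm))  # roughly [(0.0, 2.0), (0.0, 3.0)]
```

IK works the other way round: the animator places the hand, and the software solves for the shoulder and elbow angles that reach it.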


Character animation is the process of giving life to a static model, expressing thought, emotion and personality.


Creature animation follows pretty much the same principles as character animation, only the thought, emotion and personality belong to animals, fantasy creatures, or prehistoric animals.


Visual effects animation is, again, much the same, except the animator animates rain, clouds, water, lightning and other natural effects, as well as non-natural ones.

3D Modelling used in Architectural Design - A lot of architects use 3D modelling software such as "SketchUp" or "Rhino3D" for the earliest versions of their creations. When the concept is complete they tend to use tools like ArchiCAD or Revit to make building simulations. Some architects use CNC machines to print out a physical 3D model, but doing it this way requires far more precision from the designer.

Education - As of late, 3D modelling in schools is slowly becoming a main tool for teaching students. Not only does it give students enthusiasm for the lesson, but it can help them better understand the task at hand.


When students study medicine, they are often presented with 3D models of the human body's anatomy created in software like Maya. This lets them visualise and perceive the human body better than a 2D illustration would. Teachers can even get an animated model, so students can observe how the body functions in greater detail.

File Size - Polygon count, as mentioned before, has a dramatic impact on how a game runs, but a high polygon count also inflates file size, which slows exports and downloads, and the website you're uploading to might have a file size limit.

Fixing this sounds simple: just use fewer, larger polygons. In practice it isn't that easy. One of the most challenging tasks for a designer is balancing detail and smooth surfaces against a small file size.

Rendering Time - There are different types of rendering, like wireframe and polygon-based rendering, or more advanced techniques like scanline rendering, ray tracing, and radiosity. Rendering can be hugely time-consuming, taking days or weeks, but it can also take just a fraction of a second.

Real-Time Rendering - Real-time rendering is most commonly used in games and simulations, which run at anywhere from 20 to 120 frames per second (fps). Real-time rendering is the process of showing as much data as possible in each frame, making the image as close to photo-realistic as it can be at an acceptable frame rate.

Non Real-Time - This is used in films and videos. Rendering in non real-time sacrifices processing time to gain a higher-quality image; for films, a single frame can take days to render. Films usually play at 24 frames per second, though 25 or 30 are also used.
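The difference comes down to the per-frame time budget: a real-time renderer has to finish every frame within 1/fps seconds, while an offline renderer can spend hours on one. A quick bit of arithmetic:

```python
def frame_budget_ms(fps):
    """Milliseconds a real-time renderer has to finish one frame."""
    return 1000.0 / fps

for fps in (24, 30, 60, 120):
    print(fps, "fps ->", round(frame_budget_ms(fps), 2), "ms per frame")

# By contrast, at the 29-hours-per-frame figure quoted above for
# Monsters University, one second of 24 fps footage would cost
# 29 * 24 = 696 machine-hours of rendering.
print(29 * 24, "hours per second of footage")
```

So a 60 fps game has about 16.67 ms to render everything on screen, which is the whole reason polygon budgets and LODs exist.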

Radiosity - Radiosity is classed as a global illumination algorithm because the light arriving on the surface isn't just from the light sources, but also from other surfaces that are reflecting the light.

Including radiosity in rendering adds realism to the finished scene, because of the way it mimics real-world illumination.


With regular direct illumination and a light placed only outside a window, the room would be lit, but it would be at its brightest only where the light hits the floor, while the surrounding area would be barely lit.

If you were to use radiosity, however, the light outside the window would light up the floor, which would then reflect onto the walls, and so on. Depending on the material, each surface reflects the light on to other areas, dimming the reflected light slightly or heavily as it passes it along.
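The idea can be sketched with the classic radiosity equation B = E + ρ·F·B: each patch's brightness B is its own emission E plus the fraction F of light it receives from other patches, scaled by its reflectance ρ. A toy two-patch version, with all the numbers invented for illustration:

```python
def solve_radiosity(emission, reflectance, form_factor, iterations=50):
    """Jacobi iteration of B = E + rho * F * B for two patches that
    only see each other. Repeats until the bounced light settles."""
    b0, b1 = emission
    r0, r1 = reflectance
    for _ in range(iterations):
        b0_new = emission[0] + r0 * form_factor * b1
        b1_new = emission[1] + r1 * form_factor * b0
        b0, b1 = b0_new, b1_new
    return b0, b1

# Patch 0 is a bright window, patch 1 an unlit wall. With direct
# lighting alone the wall would stay black; with radiosity it ends
# up lit purely by reflected light.
window, wall = solve_radiosity(emission=(10.0, 0.0),
                               reflectance=(0.5, 0.8),
                               form_factor=0.4)
print(round(window, 3), round(wall, 3))
```

A real radiosity solver does the same thing over thousands of patches with form factors computed from the scene's geometry, which is why it is so much more expensive than direct lighting.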

Thursday 6 October 2016

Computer Game Engines - Unit 70 P1/P2

Start Date: 13/09/16
End Date: 10/10/16
Tutor: Wayne Gallear
Student: Wyatt Chapman
Unit 70
--------------------------------------------------------------------------------------------------------------------------
Computer Game Engines

About game engines
- A game engine is the incorporation of modelling, animation, terrain design, physics, sound, scripting, AI and many other game development features. Developers use these features to create the game of their choice.

Almost every games company has its own game engine, and they all function in specific ways. Some engines might have better physics, while others are more visually appealing. Compare a game like Borderlands 2, with its cel-shaded graphics, where even textures such as blood splatters are given a colourful, vibrant look, with photo-realistic games that try to imitate real life, attempting to create flawlessly beautiful landscapes, character details, model details, sounds and so on.

Unity Engine - The Unity engine focuses on portable devices, such as Android and iOS. It has, however, been used on other devices as well, such as Xbox, PlayStation, PC and Nintendo consoles.

Unity Release Date: June 8th 2005

In 2006 the engine won the Apple Design Award at the WWDC trade show. Unity said this was the first time a game development tool had ever been nominated for the award.

The engine has specific support for each individual device, so a game doesn't run better on a mobile device than it would if it were ported to a console. Unity provides shaders with multiple variants and a declarative fallback specification, which lets the engine choose the optimum video settings for your current hardware, making sure you get the best experience from the game.

When giving an object physics you must take into account how real-life objects move, react, or don't react to certain events and environments. Luckily you won't have to go through too much trouble doing this in Unity, since the engine has built-in components that handle the simulation for you. With just a few tweaks to the parameter settings you can create items/objects that act in a specific way (i.e. they can fall, be collided with, and won't move about on their own). The engine actually has two separate physics engines, one for 3D games and one for 2D.

Unity can import sound files such as .AIFF, .WAV, .MP3 and .Ogg, along with tracker modules in .xm, .mod, .it and .s3m. The engine supports mono, stereo and multichannel audio assets.

The engine comes with a program called "AudioGroup Inspector" that allows you to modify the sounds/music you have for your game. It has options like "edit in playmode" that allow you to play the game and modify the sound while playing. This can make things less tedious and easier to get the sound you want.

Unity's animation system is based on animation clips; each clip holds information that directs how an object should move: changing position, rotating, along with a few other properties. The clips are then organised into a flowchart called an "Animation Controller", which keeps track of which clips should be playing and when the animation should change.
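Conceptually, an animation controller is a small state machine: it tracks the current clip and switches to another when a condition fires. This sketch is a language-neutral illustration of that idea, not Unity's actual API; the clip names and triggers are invented:

```python
# Transition table: (current clip, event) -> next clip.
# All names are made up for this illustration.
TRANSITIONS = {
    ("idle", "start_moving"): "walk",
    ("walk", "stop_moving"): "idle",
    ("walk", "jump"): "jump",
    ("jump", "landed"): "idle",
}

class AnimationController:
    def __init__(self, start_clip="idle"):
        self.current_clip = start_clip

    def trigger(self, event):
        """Switch clips only if this event has a defined transition
        from the current clip; otherwise keep playing it."""
        key = (self.current_clip, event)
        if key in TRANSITIONS:
            self.current_clip = TRANSITIONS[key]
        return self.current_clip

controller = AnimationController()
print(controller.trigger("start_moving"))  # walk
print(controller.trigger("jump"))          # jump
print(controller.trigger("landed"))        # idle
```

The flowchart you draw in an engine's animation editor is essentially this table, with blend times and conditions attached to each arrow.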

Unreal Engine - The engine was made in 1998 and was first demonstrated in the game "Unreal". Unreal's main genre started as the first-person shooter, but because the engine uses C++ its adaptability is quite high, so most developers these days use the engine across all platforms, such as Xbox, iOS, Android, PlayStation, PC, Linux and a few others.

Unreal 4 Release Date: September 4th 2014


The newest version of the Unreal engine is "Unreal Engine 4".  The Unreal Engine 4 has been in development since 2003.

Some of the new features of this engine include a new "Blueprint" visual scripting system, which allows faster development of game logic without writing C++ and includes live debugging, and real-time global illumination using voxel cone tracing, removing the need for pre-computed lighting. Unfortunately the voxel cone tracing was replaced with a similar but less computationally expensive algorithm just before release, because of performance problems on the next generation of consoles.

On September 3rd, 2014, Epic Games moved from a subscription-based model to a marketplace where people can buy models, textures, and other assets at different prices; some textures and models can even be obtained for free. Along with the free content came multiple asset packs consisting of fully developed environments, landscapes, characters, props, sounds, animated meshes and more.

Unreal Engine 3 - Originally Unreal Engine 3 only supported Xbox, PlayStation and PC; Android and iOS support was added later, in 2010. The first console game released using Unreal Engine 3 was "Gears of War"; the first PC release was "RoboBlitz".


The engine's renderer supports techniques including HDRR, per-pixel lighting, and dynamic shadows. Many other features were added to Unreal Engine 3 in later updates, such as a global illumination solver, improved destructible environments, soft-body dynamics, iPod Touch functionality, Steamworks integration and more.

FrostBite Engine - The FrostBite Engine was made by EA Digital Illusions CE. Built for creating first-person shooters (FPS), the FrostBite Engine gave birth to the BattleField series, but the company has since expanded the engine to cover a variety of genres, like role-playing games (RPGs), real-time strategy, racing and others. At this moment in time FrostBite is an exclusive engine, only available to studios within EA.

FrostBite 3 Release Date: October 29th 2013


The engine uses three main components: FrostED, Backend Services, and Runtime. FrostED is a PC program developers use to create games in a real-time workflow, while the runtime's memory and performance systems enable the code and data systems to release content to Xbox, PlayStation, Android and iOS.

The latest version of the FrostBite Engine is FrostBite 3. Workflows and runtimes are configurable and cover all aspects of development, including audio, animation, scripting, cinematics, artificial intelligence (AI), physics, object destruction, rendering, and visual effects. FrostBite 3 added a lot of features, like a new weather system, physically based rendering (PBR), and support for other development techniques; one example would be photogrammetry.


DICE first used the FrostBite engine in BattleField: Bad Company, BattleField 1943, and BattleField: Bad Company 2. A version called FrostBite 1.5 was used to make Medal of Honor. The next generation was FrostBite 2, and the first game released with it was BattleField 3. FrostBite 2 was the version that persuaded EA studios other than DICE to use the engine; FrostBite 2-powered games include Need for Speed: The Run, Medal of Honor: Warfighter, and Army of Two: The Devil's Cartel. FrostBite 3 was released alongside BattleField 4.

Rockstar Advanced Game Engine (RAGE) - RAGE is an engine made by the technology group at Rockstar San Diego with help from other Rockstar studios. The engine has had multiple appearances on a variety of consoles, such as; Xbox, PC, PlayStation, and Nintendo Wii.

Release Date: May 23rd 2006



RAGE originally evolved from the Angel Game Engine, developed by Angel Studios for Midtown Madness and later the sixth-generation console versions of the Midnight Club series and other Rockstar San Diego games. Rockstar has imported some middleware components into RAGE, like the proprietary Euphoria character animation engine and the open-source Bullet physics engine. Before RAGE, Rockstar mainly used Criterion Games' RenderWare engine to develop games, including the Xbox, PlayStation, and PC versions of the Grand Theft Auto franchise.

CryEngine - CryEngine was made by the German developer Crytek. It has been used in all games created by Crytek, the first being Far Cry. The engine has also been used in third-party games under a Crytek licence, including Sniper: Ghost Warrior 2 and SNOW. Several companies run their own versions of the engine: Cloud Imperium Games uses a modified version for Star Citizen and Squadron 42, and Ubisoft maintains a heavily modified version dubbed the Dunia Engine, used for the more recent entries in the Far Cry series.


CryEngine 1 - This version of the engine was licensed to NCsoft for their game, Aion: The Tower of Eternity.

Crytek originally made this engine as a technology demo of Far Cry for Nvidia; when the company saw its potential, they turned it into a game. When graphics cards supporting 3.0 pixel and vertex shaders were released, Crytek shipped version 1.2 of the engine, which used these capabilities for better graphics. Soon after, Crytek developed CryEngine 1.3, which added support for HDR lighting.

CryEngine 2 - Version 2.0 first appeared in Crytek's game Crysis, and was later used in Crysis Warhead, which ran an updated build of the same engine.

The engine was licensed to an architectural and urban-planning communication company, the obvious reason being that the company could see what a building would look like before construction. Other companies, such as Simpson Studios, also licensed the engine for games. On September 17th, 2007, the engine became the first version in the world to be used for educational purposes.

CryEngine 3 - CryEngine 3 was released on October 14th, 2009 and is available for use on PlayStation, Xbox, and Wii U. The first game made using this engine version was Crysis 2, which was released on March 22nd, 2011.

On July 1st, 2011, an SDK version of CryEngine 3 was released specifically to make custom maps, mods and content for Crysis 2. Crytek also released a free-to-use version of the CryEngine for non-commercial use. It was released on August 17th, 2011 under the name CRYENGINE® Free SDK.

CryEngine 3.6/4 - Even though this 'version' is classed as 3.6/4, Crytek actually decided to rebrand the engine and simply name it CryEngine, without version numbers. They chose to do this because, as they stated, the latest edition of CryEngine would not be like any seen previously.

The new CryEngine adds support for Linux and for consoles like the PlayStation 4, Xbox One, and Wii U. At certain events Crytek have also shown the engine supporting virtual reality systems; at GDC 2015 they brought a demo called 'Back To Dinosaur Island', which showed how well the engine could handle VR.

CryEngine V -  On March 22nd, 2016, Crytek released a new version of CryEngine, called CryEngine V, which supports DirectX 12, Vulkan and Virtual Reality.