Unity Learning Expedition Round 2


Thursday May 3 we will begin Chapter 2 of our ULE series. If you missed out last time you can still join us, if you feel comfortable with the basics outlined in ULE #1 (previous thread). Again, the benefit of joining is the ability to share questions, answers and achievements w/ teammates. I frequently try to remind myself: if I’m not doing…I’m not learning. Here you will be committed to posting something each week…and #you.will.learn.

If you want to join…indicate that in this thread.

Here is the outline:
Week 1: Particles and Sprites
Week 2: Building interactive UI elements
Week 3: Characters that walk, run, jump…etc
Week 4: Characters continued… and navigation/pathfinding
Week 5: PlayMaker (or C# or Bolt) basics. You decide how you want to code.
Week 6: PlayMaker (or C# or Bolt) basics

-In the coding sections we will be jointly building a library of canned scripts. We will scavenge to find re-usable code.
-We will be using Unity 2018.1

Leading up to May 3…As time permits… I may post some explorations


Looks very interesting again! Excited about Unity 2018 too.

But, March 3 is a date from the past. May 3 should work though. :wink:


Noted and corrected. May 3. :slight_smile: I think I’m so eager to learn that I subconsciously want it all…yesterday.

Unity 2018 is an amazing step up, particularly in visuals/rendering.


I’m trying to wrap my head around all the various work-flows you can employ with characters in Unity. Here’s how best I understand it.

Options to get a character into Unity

  1. Build a character and rig it in c4d. Import into Unity.
  2. Build the character in c4d. Rig w/Mixamo. Import into Unity. I tried this yesterday and it was quick and easy.
  3. Purchase a character (modify if desired)
  4. Use Daz, Poser or People in Motion for character and rigging
  5. Rig w/ native Unity tools or with a tool from the asset store (Final IK, Puppet3d, etc)

Options to animate the character:

  1. Animate the character in c4d and import that animation data into Unity (together with the character or independently)
  2. Keyframe in Unity…likely w/a third party asset
  3. Pull in animation from Mixamo, Daz, Poser or PIM2.
  4. Import a mocap file from anywhere on the web into Unity (tons of freebies out there)
  5. Record your own mocap data and import (Using Kinect, iKenema, etc)

Unity’s character animation tool is called Mecanim. It’s pretty slick. There are also more powerful alternatives to Mecanim, such as Final IK and others.
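To give a feel for how Mecanim is driven from code, here’s a minimal sketch. The parameter names (“Speed”, “Jump”) are placeholders you would define yourself in an Animator Controller; this is an illustration of the API pattern, not a recipe from any specific project.

```csharp
using UnityEngine;

// Sketch: driving a Mecanim Animator from a script.
// Assumes the GameObject has an Animator component whose controller
// defines a float parameter "Speed" and a trigger "Jump".
public class CharacterDriver : MonoBehaviour
{
    Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        // Blend between idle/walk/run states via a float parameter...
        animator.SetFloat("Speed", Input.GetAxis("Vertical"));

        // ...and fire one-shot transitions with triggers.
        if (Input.GetButtonDown("Jump"))
            animator.SetTrigger("Jump");
    }
}
```

The actual state machine (which clips play, how they blend) lives in the Animator Controller asset; scripts like this just push parameters into it.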


One week from today Learning Expedition II starts.

Unity 2018.1 is now at version 2018.1.0f1…which I believe is the final release candidate…

There are tons of awesome new features. And getting started with those new features is made easy by templates. The templates provide visual settings as well as playback settings.

For instance, there is a template for VR and one for higher-end platforms that includes material options like SSS and topcoat. Rendering is pre-tuned for each template. More info here:
Unity Templates
Also the new Unity Hub makes it easy to have multiple versions installed and select which version you want to use.


Two quick things:

-Playmaker is on sale at the Unity Asset Store…for those wanting to use Unity w/out coding

-If you are game to join up, indicate your interest in the Expedition in this thread. Deadline: Wed night


I frankly wasn’t expecting we’d have anyone joining me for Unity LE #2.

Since there were a lot of folks tracking along and reading posts w/ the first round, I’ll go ahead and commit to ride alone…and post my discoveries and samples. I’ll try to post interesting links, visuals, videos and such as I learn.

Feel free to add your comments or questions.


Week 1: Particles and Sprites

In addition to working up examples of native Unity particles, this week I’ll explore leveraging X-Particles and using some c4d shape animations as image sequences, bringing those into Unity. Maybe also try utilizing TurbulenceFD.


Just a word about ‘billboards’

Veterans are familiar with the concept, but anywhere in your 3d world you can have 2d imposters. This is a valuable method in c4d and even more critical in game engines, where you are trying to reduce computation.

A billboard or 2d imposter is simply a plane with an image, animation or video displayed on it. The reason I call it an ‘imposter’ is that sometimes you can successfully fool the viewer into thinking it’s a full-blown 3d element.

How to fool the viewer?
-Have the plane always facing the camera so that its 2d nature is obscured. In c4d you use a tag to accomplish this, and in Unity this point-at-camera attribute is accomplished by attaching a script component.
-Distant objects such as a mountain range or far-away vegetation are great candidates for this trick, but by no means is the approach limited to distant items.
-Use of an alpha channel can definitely help merge the element and trick the viewer
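For the curious, a point-at-camera script component can be as small as this. The class name is mine; asset-store and tutorial versions vary, but the idea is the same.

```csharp
using UnityEngine;

// Minimal billboard sketch: attach to a quad/plane and it will
// re-orient every frame to face the main camera.
public class Billboard : MonoBehaviour
{
    void LateUpdate()
    {
        Camera cam = Camera.main;
        if (cam == null) return;

        // Match the camera's forward direction. Using the camera's
        // forward vector (rather than LookAt on its position) keeps a
        // whole field of billboards aligned without perspective skew.
        transform.forward = cam.transform.forward;
    }
}
```

LateUpdate is used so the billboard turns after the camera has moved for the frame.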


Amongst other things, Unity can be viewed as a Flash (or Director) replacement.

In my career I’ve probably used Director…and later Flash…in over 30 commercial projects. Sadly, Adobe let Director die and though Flash still lives on, it does so with a different moniker. Adobe has changed the name of the product to “Adobe Animate”. It now generates HTML5. Obviously it isn’t as popular these days.

Unity can be viewed as a replacement for Flash or Director, with more power and potential.

One easily can use Unity to create any kind of interactive experience with 2d, 3d or a mix. That could include, for instance, a corporate presentation.


I’m on contract at a studio currently that uses Unity for some amazing corporate-driven things. They take it to the next level though and implement multi-touch technology and very, very large screens sometimes consisting of dozens of panels.

Cool stuff.


I’d love to see samples when you wrap up.


Yes, you are right about that. Same history for me. These days I’m using Unity for interactive presentations, not for games. In Flash and Director you were able to make a projector that could be installed as a standalone app, so that’s where Unity comes in: it provides a standalone presentation in a similar way, but with all the Unity benefits. Too many options, so little time to explore them all…


Yup. :wink: We used to call them Self-executable files. Now we just say, ‘apps’.

I’ve found it interesting how many users on c4d forums share a common history:
-E-On Vue prior to c4d
-Director (macromind/macromedia/Adobe)…perhaps Flash… in their professional history


I was a Director (though not Flash) guy at one point as well. Loved ShockWave3D to pieces - it worked really well with C4D export way back then - then Adobe came along and wrecked Director immediately after buying Macromedia.

Idiots. Idiots. Idiots. ShockWave3D could easily have evolved into something like Unity or UnrealEngine over time. All they needed to do was keep up with new DirectX/OpenGL capabilities and create an optional compiler for the interpreted Lingo scripting language so it would execute faster. SW3D even had fast Havok physics back then, in 2001 or so.

Today, Shockwave3D would probably have dominated Android and iOS games and been very strong online and in the cloud as well. Even multiplayer over internet was possible with ShockWave3D back then.

Aaarggh Adobe… just aaarggh… what a crap company it has turned into.

Btw, I didn’t come from Vue. I go back as far as VideoScape 3D. :smiley:


My first 3d program was InfiniD and then Bryce.

Time for me to start posting Unity samples today.

Edit: ok…I mean it. Some stuff will be up sometime before Wed morning.


I’m going to post an image and let you consider what it might be. Obviously it’s a map/channel designed to be combined with other channels. But what is it used for?

If no one can answer…I’ll answer Thursday.


I’ve used particle systems for over a decade. Like many of you, I fell in love with them back when Trapcode blew up in the After Effects world.

This week I’ve begun to revisit particles in a new context, Unity. I’ve been surprised to learn new angles and possibilities:

-Take c4d/ae/xp effects into Unity as sprite sheets (or movies)…then employ in particle systems
-Build 100% native Unity particle systems

But there’s more:
Cross-pollinate. I hope to take tricks I learn in Unity and employ those tricks in my particle work when using c4d/xp and in After Effects.

In later posts I’ll explain what a few of those tricks might be.


First…for those that don’t know…what’s a sprite sheet?

Consider a sequence of images in different paradigms…
-A layered photoshop document
-A Final Cut, Premiere or AE timeline where one ‘image’ is sequenced after another
-A rendered sequence sitting in a folder on your hard drive

In the game world, to optimize performance, you can have a sequence of images all laid out on a grid in one image. That’s a sprite sheet and it looks like this:

Obviously this is quite useful in 2d games. But it can be used in lots of ways and it’s often used with particles.
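As an illustration of the particle angle: Unity’s particle system has a Texture Sheet Animation module that steps through a sheet like this per particle. You’d normally set it up in the inspector, but a script sketch shows the moving parts (the 8x8 grid is an example value, not tied to any asset in this thread):

```csharp
using UnityEngine;

// Sketch: pointing a ParticleSystem at a sprite sheet via the
// Texture Sheet Animation module. Assumes the particle material's
// texture is a sheet laid out on an 8x8 grid.
public class SheetParticles : MonoBehaviour
{
    void Start()
    {
        var ps = GetComponent<ParticleSystem>();
        var tsa = ps.textureSheetAnimation;
        tsa.enabled = true;
        tsa.numTilesX = 8;   // columns in the sheet
        tsa.numTilesY = 8;   // rows in the sheet
        tsa.cycleCount = 1;  // play the full sheet once per particle lifetime
    }
}
```

Each particle then flips through the grid cells over its lifetime instead of showing a static texture.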

I used c4d/X-Particles to create a probe. The object spins while the inner sphere has an animated texture and little flares emit from corners of the object. For convenience I will refer to it as, “EyeQ”.

I exported this into After Effects and used a script called ‘Sheetah’ to generate a sprite sheet. Here’s a portion of the sprite sheet. (Yes, there is alpha). My original intention was to have EyeQ display about 3x larger than what you see here.


So how did the “EyeQ” probe turn out in Unity?


Not bad, eh? No viewer would have any idea that it’s a series of flat graphics on a plane. The video is super compressed and Dropbox video playback isn’t helping. It looks pretty clean in Unity.

I spent very little time creating the asset. Use your c4d skills…any skills with x-particles, mograph, or turbulenceFD can play well in the Engine.

I animated EyeQ’s position up and then over, and moved around to see it from different viewing angles, testing believability. Storage requirement for the graphic sequence was a mere 1.7 MB.
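For anyone wondering how sheet playback on a plain plane (rather than a particle) can work, one common approach is to step the material’s UV tiling and offset each frame. This is an illustrative sketch, not the EyeQ setup: the grid size and frame rate are example values.

```csharp
using UnityEngine;

// Sketch: play a sprite sheet on a plane by stepping the material's
// UV tiling/offset. Assumes the renderer's texture is a sheet laid
// out left-to-right, top-to-bottom.
public class SheetPlayer : MonoBehaviour
{
    public int columns = 8;
    public int rows = 8;
    public float framesPerSecond = 30f;

    Material mat;

    void Start()
    {
        mat = GetComponent<Renderer>().material;
        // Scale UVs so only one cell of the grid is visible at a time.
        mat.mainTextureScale = new Vector2(1f / columns, 1f / rows);
    }

    void Update()
    {
        int frame = (int)(Time.time * framesPerSecond) % (columns * rows);
        int col = frame % columns;
        int row = frame / columns;
        // UV origin is bottom-left, but frame 0 sits at the sheet's
        // top-left, so flip the row when computing the offset.
        mat.mainTextureOffset = new Vector2(
            (float)col / columns,
            1f - (float)(row + 1) / rows);
    }
}
```

Pair this with an unlit transparent material and the alpha channel mentioned earlier, and the plane reads as a free-floating animated object.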