Today we finally get the chance to talk to Mikael Burman, Technical Supervisor with the Cinematics Department at Massive, an Ubisoft studio in Sweden.
http://www.massive.se. He was a member of the team that worked on Far Cry 3. It has taken some time to get this busy guy to answer our questions.
As always we start with some personal questions.
I guess all you guys had fun at the Far Cry 3 release party.
Haha... well, I actually didn't attend it. :D I'm not much for parties, to be honest, and I like to take any opportunity I can to spend time at home, writing music or doodling in 3D. :) Yes, I am a lone wolf... :) But I'm sure that those who went to the party had a great time! :)
As an introduction Mikael, please tell us a little about yourself.
I’m 35 years old. I grew up in Sala, a small town in the middle of Sweden. As with most small towns, there isn’t a lot to do, so I found myself playing drums in a death/thrash metal band and also developed an interest in computers. In 2000 I managed to make my first piece of animation using a mix of Imagine and LightWave on the Amiga.
Fast-forward a couple of years and I had gotten myself a Windows PC running LightWave version 7. I was merely a hobbyist and did some VFX and commercial work, all for free, of course. None of it was high-end at all, but getting into projects was good, because I learned a whole lot by doing it. And there is a certain motivation behind doing real projects compared to just noodling around by yourself.
"And there is a certain motivation behind doing real projects compared to just noodling around by yourself."
In 2004 I managed to get a reel done and got accepted to the School of Future Entertainment (SOFE), where I learned Maya and got an internship at Fido Film in Stockholm. I was at Fido Film for almost 6 months as an intern and then went back to them as a freelancer after graduating from SOFE in spring 2006. At this point I was also in contact with the Cinematics Director at Massive Entertainment, who was looking to get more talent into the team... at the same time, Fido wanted to contract me to do more work for them, so I had a rough time deciding what I wanted to do... Eventually I decided to go to Massive Entertainment.
Apart from work, I picked up a Roland electronic drum kit a couple of years ago and I’m in the process of writing my own kind of rhythmic metal/rock music and learning more about producing music. I also enjoy playing computer games and watching movies and TV series, and I tend to end up noodling around with 3D a lot in my spare time as well.
How long have you been with Massive Entertainment?
I joined Massive on the 1st of September 2006... so... 6 years and 3 months, roughly.
Now we get to Massive Entertainment and for sure we want to hear some company secrets.
Tell us a little about Massive Entertainment? What is it like to work there?
Massive is located in Malmö, a large city in the south of Sweden. When I joined there were around 40 talents in total, today we are around 250; a mix of coders, graphics artists, producers, directors and managers.
Malmö is a fantastic city to live in. One of the things I noticed when moving down here from Stockholm was the much calmer atmosphere; the pace of everything feels less hectic, and generally, people smile a lot more down here. Since I grew up in a small town, Malmö feels much closer to that than Stockholm does. It fits me quite well.
Working in the Cinematics team has been a whole load of fun and, occasionally, a lot of hard work as well. Not that the work per se is hard, but there is sometimes a whole lot to do in a relatively short period of time, and since we are a very small team, it is sometimes comparable to climbing a very large mountain, or running a marathon, except that you need to do it in a given timeframe. That can be tough, depending on the project. And that is another factor that makes the Cinematics team so fun to work with; there is a wide variety of projects... everything from logos to high-res cinematics to realtime in-engine cinematics.
How large is your team and what are you working on?
The Cinematics team is a small team... we recently staffed up to make sure we can deliver a project on time, so we went from 11 people to around 16. The reason we can stay relatively small for these kinds of projects is the multitasking ability of each artist in combination with a multi-application pipeline. It does mean that each of us has to work harder and longer (especially on larger projects), but at the same time, it gives each individual more responsibility, which, in turn, makes it more fun for the artists. This also means that we are a very flexible team; we can grow when needed, or outsource certain tasks if the talent we need is not available in-house.
When did you start working on Far Cry 3? Which parts were you responsible for?
The cinematics team was involved in a lot of things, such as helping the game team to visualize some aspects, and we storyboarded and directed the motion capture for the in-engine co-op cutscenes. We also did the Far Cry 3 Outpost app video. The main thing, though, was the introduction cinematic for the four co-op characters, Tisha, Yuriy, Callum and Leonard, which ended up as a pre-rendered project instead of in-engine.
Can you give us some statistics on the videos you worked on: render times, polygon counts, etc.?
Wow... where to start? :D
A typical frame for the co-op intro movie took around 10 minutes for the environments in 720p, and another 10 minutes per character, very dependent on the framing, of course. So the average was around 20 minutes for the 3D, though this doesn't include the FX. I know the FX guys are very good at optimizing their render times, but unfortunately I don't have any numbers for their average render time.
The heaviest scene was the kitchen... that environment was basically two things: blurred reflections and GI. Not the best combination if you want to render fast, and it leaves very little room for optimization, except for breaking things up into lots of layers. We didn't do that, so some of those frames could take up to an hour on an i7 920 @ 2.8GHz (a low-end i7, by the way).
What really helped shave off render times for the characters was db&w's Spherical Harmonics Light: essentially, we rendered a keylight pass and an SH pass and output buffers through exrTrader (another of db&w's products). Spherical Harmonics is leaps and bounds faster than any GI, but gives a result close enough to pass as true GI; the best thing about it is that you never have to worry about GI flickering. Janus was also a key part of being able to manage the render layers and which buffers to output depending on the pass.
The polygon count was quite low in most environments... around 3-4 million polygons in the heaviest of them. The characters were taken directly from in-game and we used pretty much the exact same textures.
What does your pipeline look like?
We are using a bunch of different tools for their respective strengths. For animation and motion capture, we use a combination of Maya and MotionBuilder. For FX we use 3ds Max with plugins such as Thinking Particles, RayFire and FumeFX. For general 3D rendering we use LightWave, with Janus and exrTrader being the key tools, and Modo.
To exchange data between all the applications we use PointOven, which gives tools like Maya and 3ds Max the ability to load and save .LWO and .MDD files (LightWave and Modo can already handle these formats natively). PointOven also provides very good camera-matching functionality, which FBX sometimes fails to deliver.
"We are using a bunch of different tools for their respective strengths."
This means that we seldom have to duplicate our efforts, since all packages can load the same data. Shaders are obviously not compatible for many reasons, but it doesn't pose a problem.
For animation, the animators bring the data from MotionBuilder into Maya, where they bake out MDDs. Since MDDs are referenced and not actually loaded into any scene, all applications are then automatically updated with the new animation. We call it "The Ripple Effect".
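As an aside, the MDD point cache that makes this referencing trick possible is a very simple format: a frame count, a point count, a time value per frame, and then raw XYZ vertex positions for every frame. Here is a minimal round-trip sketch in Python, assuming the common big-endian MDD layout; this is an illustration, not part of Massive's actual pipeline:

```python
import os
import struct
import tempfile

def write_mdd(path, times, frames):
    """Write an MDD point cache: frame count, point count, per-frame
    times, then XYZ positions per vertex per frame (all big-endian)."""
    num_frames = len(frames)
    num_points = len(frames[0])
    with open(path, "wb") as f:
        f.write(struct.pack(">ii", num_frames, num_points))
        f.write(struct.pack(">%df" % num_frames, *times))
        for frame in frames:
            for x, y, z in frame:
                f.write(struct.pack(">3f", x, y, z))

def read_mdd(path):
    """Read an MDD point cache back into (times, frames)."""
    with open(path, "rb") as f:
        num_frames, num_points = struct.unpack(">ii", f.read(8))
        times = list(struct.unpack(">%df" % num_frames, f.read(4 * num_frames)))
        frames = []
        for _ in range(num_frames):
            flat = struct.unpack(">%df" % (3 * num_points), f.read(12 * num_points))
            frames.append([tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)])
    return times, frames

# Round-trip demo: a two-frame cache for a single triangle.
times = [0.0, 1.0 / 24.0]
frames = [
    [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)],
    [(0.0, 0.5, 0.0), (1.0, 0.5, 0.0), (0.0, 1.5, 0.0)],
]
path = os.path.join(tempfile.gettempdir(), "triangle.mdd")
write_mdd(path, times, frames)
times_back, frames_back = read_mdd(path)
```

Because every application references the same file from disk at load time, re-baking the MDD updates the animation everywhere at once, which is exactly the "ripple" described above.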
The FX guys seldom need any textures, so they render their FX light contribution as a gray diffuse pass and then utilize the Raw Color buffer output from LightWave in comp to add in their lights. Sometimes we need to get their lights into LightWave for a proper FX light pass, but that is generally not the case, and getting the data into LW is not a daunting process in the cases where it is needed.
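The relighting trick described here boils down to a per-pixel multiply in the compositor: the FX pass carries lighting on flat gray shading, and the Raw Color buffer carries the surface color. A toy sketch with NumPy standing in for the comp package (the buffer contents here are made up for illustration):

```python
import numpy as np

# Hypothetical 4x4 RGB buffers standing in for real render outputs.
h, w = 4, 4
fx_light_gray = np.full((h, w, 3), 0.5)                  # FX lighting on flat gray
raw_color = np.random.default_rng(0).random((h, w, 3))   # Raw Color (albedo) buffer

# Multiplying the two "textures" the gray FX lighting in comp,
# so the FX renderer never has to load the scene's textures.
relit_fx = fx_light_gray * raw_color
```

The design win is that the expensive FX simulation and the texture-heavy beauty render stay decoupled; only the compositor needs both.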
We are also investigating the .MOT format and working on an implementation that will allow it to reference keyframe data at load time, just as MDDs do for us right now, across all applications.
I know you are an avid user of db&w software, which plugins are you using?
exrTrader, db&wTools, db&wBOT (not available to the masses yet) and infiniMap. Of those, exrTrader and db&wBOT are the ones we use in all projects.
Haha, that's cool. db&wBOT is a set of tools fresh from development; they might make it into products later. At the moment they are only available on special request and mostly hidden from management. ;-)
What couldn't have been done without our software?
We have been working hard on establishing a good pipeline between LightWave and Nuke. Without exrTrader it wouldn’t have been possible to make it as effective as it is, regarding the buffers and outputs we use (some of them had to be remapped/offset to fit Nuke), and exrTrader facilitates that quite well. What used to be done in post with certain buffers can now be done directly when the buffer is saved out from LightWave, and that is valuable.
You guys have also been very forthcoming and responsive to us when we needed it, so that is also a HUGE plus!
Thank you for the compliment. We are always happy to help, and we know that when you are in production, you sometimes need things right now, not later.
Do you have any feature requests?
Lots of them. :D The number one thing I would want is to be able to create my own buffers for exrTrader to put out. Right now, some things that we would want as a buffer require a new render layer in Janus. Not a big deal, really, but... you know... if something could be a buffer instead of a pass, then that would be really nice!
Oh... and please... bring Deep Compositing asap to LW! ;)
Deep Compositing took its first steps into exrTrader during SIGGRAPH 2012, and we plan to finish it by SIGGRAPH 2013 in Anaheim. Custom buffers will be there, too.
We know that you're a very active member of the LightWave 3D community as "cageman" and have published a few workflow videos.
How does the community compare to those of the other packages you work with?
The LightWave community feels more like a family compared to those of the other apps. Within this family, people tend to share a great deal of information and help each other. If you post a question on the forums, a couple of people will spend quite a lot of time answering it. There is less of an "RTFM mentality", which helps create a creative atmosphere. Of course, there are situations when you never get an answer, but that is most likely because no-one knows. :)
"The LightWave community feels more like a family compared to those of the other apps."
Another thing that I've noticed is that most senior LightWave talent knows a couple of the other applications, such as Max/Maya/MotionBuilder, as well as they know LightWave, and that is extremely good when building a small but versatile and strong team.
Let's get back to Far Cry 3, which is certainly very impressive. Which parts of it were developed at Massive in Sweden?
The online parts: Multiplayer and Co-op... though Co-op also supports a split-screen two-player mode. :)
First person shooter and real time strategy games
Ubisoft Massive has never done an FPS before. You come from real-time strategy games like "World in Conflict" and "Ground Control". So where do you see the difference?
Well... World in Conflict does flirt with the FPS world. The camera navigation is based on the WASD standard and it focuses on intense action rather than base building and resource gathering. At the time of release, it was probably the most versatile RTS engine, since you could zoom in and follow a single soldier, almost like an FPS with a third-person camera... and the realtime in-engine cutscenes were closer to what one expected from an FPS at that time.
That said, there are tons of things that are very different as well... Graphics, for example, require more fidelity in all aspects (textures, mesh LODs, etc.). Animation is usually much more demanding, not just for in-engine cinematics; the enemies you meet in an FPS usually need a lot more animation states, loops, separated upper-body/lower-body systems and so on.
How did the integration of multiplayer elements into the single-player game work?
Since the same engine was used and kept in sync, Massive had access to exactly the same stuff as the team in Montreal, and vice versa. As far as characters go, for the multiplayer part, Massive just used what was already there and focused on designing and creating a bunch of multiplayer maps and play modes, and of course, the co-op campaign.
How procedural is Far Cry 3, especially when I think of world building?
To be honest, I have no idea. :) We never touched those tools in the cinematics team.
Do you think games are an art form?
Absolutely, yes... I would say that it isn't just a singular form of art; it is a collection of many different art forms that, at the end of a good day, creates a game that many want to play and have fun with.
Do you think there is a chance for games to climb the cultural ladder?
Yes... I think it already is climbing. When I started out on the Amiga many years ago, you were a geek in a bad way if you played games. That isn't true anymore... You know... being a geek these days has become somewhat cool. Many games today also have social networking as part of the gameplay or as a game feature, which makes them more appealing to a broader audience.
Do smart phones change the way of game development or are smart phones just another platform?
Both... Most games on the phone market make me look back... to those C64/Amiga days, when very small teams, or even individuals, created games that sold very well. A mobile game today is pretty much that: done by small teams or individuals. This, in turn, makes a game extremely cheap to produce, and they are usually done in months rather than years. Even if you completely fail and have zero sales of such a game, you didn't lose a lot of money.
While smartphones and tablets are something of a reboot for a lot of individual or small-team developers, they also open up opportunities for those who create the AAA titles... not in the sense that a game like Far Cry 3 would be ported to such a device, but companion apps are certainly going to show up. I mentioned our Far Cry 3 Outpost app video earlier, and that video is a good example of how smartphones can be used for AAA titles.
Far Cry came out in 2004, Far Cry 2 in 2008 and Far Cry 3 now in November 2012. Can we expect Far Cry 4 in 2016?
I would love to play Far Cry 4. I hope you are right!
Can you tell us what you are working on right now?
So much for the company secrets we wanted to hear...
One of your game designers said: "Everyone is tough when it is sunlight, but as soon as the sun goes down they start crying for mom." So we thank you for the interview and wish you a lot of sun. And instead of crying for mom, you can always cry to us and ask us for new software to help.
Haha... yes.. and trust me, we will be bugging you guys in the future... ;)
Oh, since I know you are a musician, please don't forget to tell us about that and how we can listen to your tracks.
I have a few of them on SoundCloud, but they are WIP and some of them have progressed quite a lot since I posted them online. So, now that you've been warned... search for Mikael Burman on SoundCloud.
I do have a plan, though, to release an "album"... probably during this year. :)
We have heard some of your tracks and we think they are cool. We wish you good luck with your album.
Thanks again, Mikael.
Thank you guys!