
Major Project

Progress so far
Phase 1 Testing

As documented in my Research Journal, I have done three test shoots so far. For my first test shoot I used different coloured tape to create multi-coloured bands and then drew black dots of different sizes onto them.

testshoot1.png

We shot a few takes of the scene where the actor walks across the room and picks up something from the table. After that I took reference images of the scene in general to improve the track later in 3DEqualizer, and I also took a lot of images of the taped-up arm to help me solve the object track.

The idea behind this was to take a picture from every possible angle to get multiple reference images for each dot.

testshoot1_1.png

After opening 3DEqualizer I realised right away that this approach left plenty of room for improvement. The black dots were way too small to track and the yellow tape was highly reflective. Nevertheless, the scene track was fine, and even the arm with the patches tracked all right in the end, although the deviation ended up quite high.

The only real problem was the reference images of the arm. Because the arm had been rotated between photos, it turned out to be impossible to get solid 3D data from them.

 

Lessons learned:

Although I wasn't able to track the arm as planned, it wasn't a bad start for a first test. I learned that I need better tracking markers/bands for the next test, and a new approach for the arm reference images.

Screenshot 2019-02-10 at 23.00.35.png
Phase 2 Testing
testshoot2.png

For my second test shoot, with the help of Rich, I printed out black-and-white marker bands with big symbols on them.

testshoot2_1.png

I also changed the setup of the shoot. With Rich's help I used two cameras, one as the main camera and one as a reference/witness cam to get additional data for the track. The cameras were gen-locked and their timecode was synchronised. I put more tracking markers on the floor and the back wall than may have been necessary to help me track the scene later. The first shoot was also part of an assignment for the MatchMoving module, which is why I had to clean up all the markers in the scene and was therefore careful not to use too many. This was a test shoot only, where I didn't need to worry about clean-up.

testshoot2_3.png

At the second test shoot we had the option of shooting at 60fps and 720p or at 30fps and 1080p because of camera restrictions. We chose the first option because the frame rate seemed more important for the quality of the track, but it turned out to be the wrong choice. The quality wasn't good enough for most of the sequences we filmed. When I zoomed in on the tracking markers they were so blurry that tracking was not possible.

testshoot2_2.png

Luckily we filmed several takes; in one of them I move the camera towards the actor and therefore get closer to the tracking markers, so that's the sequence I chose to work on. It is still blurry when zoomed in, but workable.

testshoot2_4.png

The scene tracked perfectly with the data from the two cameras, and the arm track turned out better than expected too.

testshoot2_5.png

The biggest improvement, though, turned out to be the new approach for the reference images of the arm. This time the actor tried to stay perfectly still while I orbited around him with a camera, trying to capture all the markers from different angles. This proved far more effective than what we did in the first test shoot. The 3D data resulting from these images formed perfect circles and looked exactly as it should.

 

Lessons learned:

The most important lesson from this second test was to always shoot in 1080p; otherwise the quality is simply not good enough to get a proper track.

This is a screen recording of the second test, created before the Christmas break. It's a bit shaky in places, but I'm generally pretty happy with the outcome so far. I hope I can fix the last shaky bits in the Graph Editor in Maya.

And this is a screen recording of the tracked scene imported into Maya. I tweaked the animation a little in the Graph Editor for the parts of the track that go wild, and the track looks pretty solid now.

Screenshot 2019-02-17 at 21.46.35.png

As you can see in the screen recording above, the track looks pretty decent, but when I tried to attach geometry to it the geometry would always rotate the wrong way. Looking for the problem, I discovered that in 3DEqualizer the track also looks OK, but somehow the bands are flipped 180 degrees. As shown in this screenshot they are bent towards the back wall, as if I had been tracking the back of the arm. I have no idea why, but if I can fix this then pretty much everything should work.

Phase 3 Testing

I've started my third test now and it's basically the same setup as the second test, but I've used three cameras (one main, two witness, all full HD) to get more data. I've also used a digital clapperboard to sync the three cameras.

Screenshot 2019-02-17 at 21.41.33.png
Screenshot 2019-02-17 at 21.37.03.png
Screenshot 2019-02-17 at 21.37.49.png
Screenshot 2019-02-17 at 21.37.29.png

Main camera

Witness cam 2

Witness cam 1

Screen recording of the third test so far. The worst of the craziness is easily adjustable in Maya with the Graph Editor, so the track isn't too bad. But I've got the same flipped-rotation issue with the armband again.

Previs

For my previs, I originally planned to film the performance of my dancer and edit the video, as that would have shown my entire project without the CG part. I've been in contact with the UWL dance society but unfortunately they aren't too responsive, so I haven't had a chance yet to film the performance.

That is why I've used part of a video I found online which shows roughly in what direction I want my final video to go. I have highlighted one arm (I doubt I can do both arms and legs; I'd rather do one arm perfectly than all four limbs rushed) which I am planning to replace with a CG one.

Because the original video isn't my work I can't make this video public so you have to click on the link.

Password: previs

Final Shoot

I had to change my idea because I couldn't find anyone who agreed to dance in my video. My classmate Azad agreed to participate instead, and the idea changed from dancing to simply filming different shots of him staring intensely into the camera.

We shot with three cameras and multiple lights, and used a digital clapperboard to synchronise the cameras after the shoot.
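The synchronisation itself is just an offset calculation: find the frame where the clapperboard closes in each camera's take and shift everything onto a common timeline. A minimal sketch of that arithmetic (the frame numbers below are made up):

```python
# Minimal sketch of syncing the three cameras from the digital clapperboard:
# note the frame on which the clap closes in each take, then offset every take
# onto the main camera's timeline. The frame numbers here are made up.
clap_frames = {
    "main":     143,   # frame where the clap closes in the main camera
    "witness1": 118,
    "witness2": 131,
}

def offset_to_main(camera, claps, reference="main"):
    """Frames to add to `camera` frame numbers so they line up with the reference camera."""
    return claps[reference] - claps[camera]

for cam in clap_frames:
    print(cam, "offset:", offset_to_main(cam, clap_frames))
```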

AzadShoot.jpg

I wanted to make sure there was enough tracking data, so I used a lot of markers which I taped onto Azad's arm (the watch was also taped down to stay in place). I never intended to track all the points, but it's good to have them there if I need them.

armReferernce2.JPG
armReference1.JPG

These are a few shots I selected to work on for my final project. I planned to finish one of the shots completely to see how long it would take me, and then see how much time was left to finish one or more additional shots.

The first thing I did was a 3D stabilization pass in Nuke, because most of the shots are handheld and pretty shaky, which I thought didn't look very professional. The stabilization didn't work out perfectly for all the shots, but I'm not sure how many shots I can work on anyway, so it's fine for now.

3DEqualizer

The first task was to track the shot in 3DEqualizer. I used loads of reference images and measured a few distances to create distance constraints, to make sure the room/camera solve would be good. I used the last clip from the video above because the stabilization worked best on that one and I wanted a shot with a moving camera in my showreel.

roomReference01.jpg

These are the notes I took, including the measurements of course.

roomReference02.jpg

As you can see in this video, the room itself gets reconstructed fine in 3D space, but even after loads of parameter adjustments the camera still doesn't solve as it should. It's supposed to be a smooth arc.

The object track doesn't work properly either, but that doesn't affect the camera solve in a bad way, as I tried it with and without it (in fact it should help the camera solve).

I had never had this problem before; as far as I could tell, all the data looked perfectly fine and the camera solve should have been solid. I spent two weeks trying different things to fix it and finally had a breakthrough when I noticed that in most of my versions the deviation spikes at the end of the shot. So I looked at the footage again and noticed that in the last 50 frames of the main camera there is almost no tracking data available, because the few markers that are still there are hidden behind the person and only one track remains visible at that point. I deleted those 50 frames and, like magic, everything more or less worked, which was pretty upsetting because I had wasted two weeks and in the end it was an easy (kind of obvious once you know it) fix.
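In hindsight, a quick check of how many markers are actually visible in each frame would have caught this immediately. Here is a small sketch of that sanity check, assuming the 2D tracking data has been exported as per-frame lists of visible marker names (the data layout is hypothetical):

```python
# Sanity check I wish I had run two weeks earlier: flag frames where the main
# camera sees too few markers to constrain the solve. The data layout is
# hypothetical -- a dict of frame number -> names of markers visible in that frame.
MIN_MARKERS = 4  # below this, a frame contributes almost nothing to the solve

def frames_with_too_few_markers(visible_per_frame, min_markers=MIN_MARKERS):
    """Return the frame numbers where fewer than `min_markers` markers are visible."""
    return [frame for frame, markers in sorted(visible_per_frame.items())
            if len(markers) < min_markers]

# Tiny made-up example: the last frames only keep one marker, like my shot did.
example = {
    1198: ["m01", "m02", "m03", "m04", "m05"],
    1199: ["m01", "m02", "m03"],
    1200: ["m02"],
}
print(frames_with_too_few_markers(example))  # -> [1199, 1200]
```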

The stabilization I did in Nuke messed with the video a bit: it's not a perfect arc in the end and it has some weird motion blur in places, which I'll show later.

After I deleted those frames I redid all the parameter adjustments to get to my final outcome. I also changed from an Object Point Group to a MoCap Point Group, because after I explained my problems to the 3DEqualizer customer support they suggested that MoCap would probably work better for this project.

These are all the things I adjusted:

  1. Disable everything but 1 camera and point group camera, disable camera synchronisation, calc all

  2. Adjust parameters focal length and distortion for main camera, calc all

  3. Disable main camera, enable witness cam, repeat step 2

  4. After all cameras (parameters) have been adjusted individually enable everything again and synchronise cameras, calc all

  5. Calculate timeshift of two witness cams, calc all, view → show rolling shutter/timeshift

  6. Adjust all parameters together, calc all

  7. Increase weight of main cam to 100 (the main cam is the most important one and needs to influence the calculation more than the witness cams)

  8. *MoCap group attributes --> 'Affect Calculation of Synced Cameras' --> Disable (with this enabled, the natural noise of the MoCap points affects all the camera points and makes them shaky)

  9. Camera Point Group Attributes --> Postfilter Mode --> Off (fixes shaky MoCap points - I don't really understand why [if all points shake similarly it's a problem with the point group, not the individual points])

*Points 8 and 9 didn't work together for me; I still had either a shaky camera or shaky MoCap points, so I decided to skip step 8 and have perfect MoCap points (which seemed more important to me) and keep the slightly shaky camera points.

I followed this video (and the second part of it) for most of my 3D tracking: https://www.youtube.com/watch?v=WtzMHTdViA8

As you can see in this video, the MoCap points stick perfectly to the markers, deviation is really low and the project is ready to be exported to Maya.

As I said before, I never intended to track all the points on the arm. I realised that the upper arm isn't really moving in this shot, so I figured I don't need a lot of points there. I can always come back and track more points if necessary. 

Maya

I realised pretty quickly that I'm not good enough at modelling to build a convincing/cool robot arm for my project, and since my project isn't about modelling (and I don't want to work as a 3D modeller in the future) I didn't want to waste too much time on the actual arm.

arminspo.png

I found this picture on Instagram and it inspired me to make an arm with MASH in Maya. MASH is pretty easy to use and you get really cool results really quickly.

armOG.png

This is the base arm I used as an input for my MASH network, which itself started from a single cube. Before starting with MASH I imported my script from 3DE to get all the locators, the camera and the image plane.

Mash01.png

After connecting the arm to the MASH network, I only had to distribute the cubes on 'Face Centre' and tick 'Flood Mesh' to create enough cubes for the arm. That's literally all you need to do to get loads of cubes in an arm shape.

Mash02.png

To make it look a bit better I also added a Random node to the MASH network and played around with that a little bit. 
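For reference, this is roughly what that setup looks like when scripted with the MASH Python API instead of the UI. The object names are placeholders for my cube and arm mesh, the exact call signatures may differ slightly between Maya versions, and the 'Face Centre'/'Flood Mesh' options are still set on the Distribute node as described above.

```python
# Rough sketch of the MASH setup via the MASH Python API. "baseCube" and
# "armMesh" are placeholder names for my single cube and the base arm geometry.
import maya.cmds as cmds
import MASH.api as mapi

cmds.select("baseCube")                    # the single cube MASH will replicate
mashNetwork = mapi.Network()
mashNetwork.createNetwork(name="armMASH")  # creates the Waiter/Distribute/Repro nodes

# Distribute the cubes over the arm geometry instead of the default linear layout
# ('Face Centre' and 'Flood Mesh' are then ticked on the Distribute node).
mashNetwork.meshDistribute("armMesh")

# Add a Random node for a bit of variation in the cube positions.
randomNode = mashNetwork.addNode("MASH_Random")
cmds.setAttr(randomNode.name + ".positionX", 0.2)  # small jitter, purely illustrative
```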

Next, I rigged the arm (the MASH-created mesh is hidden here) and decided to put three joints in the forearm to get more flexibility and to be able to move the middle of the arm individually. I bound the skin and tried to place the arm where the MoCap locators are, which turned out to be really tricky.

Armrig.png

I created spheres for each joint and placed them where the arm is, purely as a visual guide. Then I parented the joints to the spheres and the spheres to some of the MoCap locators, which gave me the basic movement of the arm.
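A minimal sketch of that joint/sphere/locator chain in maya.cmds; all names, positions and values are placeholders, and in my scene I set the constraints up by hand rather than scripting them:

```python
# Minimal sketch of the joint -> guide sphere -> MoCap locator chain described above.
# All object names and positions are placeholders, not my actual scene names.
import maya.cmds as cmds

# Elbow plus forearm joints for extra flexibility, ending at the wrist.
cmds.select(clear=True)
joints = [
    cmds.joint(name="elbow_jnt", position=(0, 0, 0)),
    cmds.joint(name="forearm01_jnt", position=(4, 0, 0)),
    cmds.joint(name="forearm02_jnt", position=(8, 0, 0)),
    cmds.joint(name="wrist_jnt", position=(12, 0, 0)),
]

# Bind the (hidden) base arm mesh to the joints.
cmds.skinCluster(joints, "armMesh", toSelectedBones=True)

for jnt in joints:
    # Guide sphere placed where the joint sits, purely as a visual helper.
    sphere = cmds.polySphere(name=jnt.replace("_jnt", "_guide"), radius=0.5)[0]
    pos = cmds.xform(jnt, query=True, worldSpace=True, translation=True)
    cmds.xform(sphere, worldSpace=True, translation=pos)
    # The joint follows its guide sphere...
    cmds.parentConstraint(sphere, jnt, maintainOffset=True)
    # ...and the sphere follows one of the MoCap locators exported from 3DE
    # ("locator_elbow_jnt" etc. is a hypothetical naming convention).
    cmds.parentConstraint("locator_" + jnt, sphere, maintainOffset=True)
```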

MayaSpheres.png

This is the finished arm that's already connected to the MoCap data.

MayaArmFinal.png

A quick demo of the final arm and how it deforms.

A quick demo of the arm movement. There is still a bit of keyframe animation to do, but it already works really well.

Lighting

After my final shoot I took reference photos of the lights we used, so I could recreate them as accurately as possible in Maya (some of the lights had already been taken down at that point, but I took notes as well).

lightReference01.jpg
lightreference02.jpg

I tried to rebuild the room and the lights as accurately as possible without spending too much time on it. My E-Mentor Addison Petrie suggested putting a rough body model into the scene to get more accurate light spill and shadows on the arm.

All the geometry has textures assigned as well, to recreate the right lighting conditions.
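As a rough illustration of what that rebuild amounts to in maya.cmds (every size, position and intensity below is a placeholder rather than one of my measured values, and Maya's built-in lights stand in for the lights I actually placed by hand):

```python
# Rough illustration of the light/room rebuild. Every value is a placeholder,
# not a measurement from my reference notes.
import maya.cmds as cmds

# Simple proxy geometry: floor, back wall and a rough body stand-in
# so the CG arm receives believable light spill and shadows.
cmds.polyPlane(name="floor_proxy", width=400, height=400)
wall = cmds.polyCube(name="backWall_proxy", width=400, height=250, depth=5)[0]
cmds.move(0, 125, -200, wall)
body = cmds.polyCylinder(name="body_proxy", radius=20, height=80)[0]
cmds.move(0, 110, 0, body)

# Two lights roughly matching the on-set setup.
cmds.spotLight(name="key_spot", intensity=2.0, coneAngle=60)
cmds.move(150, 220, 150, "key_spot")
cmds.rotate(-35, 45, 0, "key_spot")

cmds.pointLight(name="fill_point", intensity=0.6)
cmds.move(-120, 180, 100, "fill_point")
```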

MayaScene.png

Close-up of the body model, to which I also assigned a shader to simulate the t-shirt my performer was wearing.

MayaFigure.png

I used Substance Painter for my texture, chose a preset called SciFi_PVC and played around with it a little until I was happy. Initially I wanted a black or grey colour, but that turned out to be too dark overall. With MASH I only had to apply the texture to the initial cube I created the network with, and it was automatically applied to all of the copies. The first version I rendered as a test turned out a bit too dark and with too many shadows, so I had to add another light.
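That propagation works because the shader is assigned to the MASH input object and the network carries the assignment onto every copy. Roughly, in script form (shader, texture path and mesh names are placeholders, a plain Blinn stands in for my actual shader, and I did the real assignment through the Hypershade):

```python
# How the Substance texture applied to the single input cube ends up on every
# MASH copy: the shader is assigned to the source cube and the network
# replicates it. All names and paths below are placeholders.
import maya.cmds as cmds

# Build a simple shader and its shading group.
shader = cmds.shadingNode("blinn", asShader=True, name="armCube_mat")
sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True, name="armCube_matSG")
cmds.connectAttr(shader + ".outColor", sg + ".surfaceShader", force=True)

# Hook up the base-colour map exported from Substance Painter.
tex = cmds.shadingNode("file", asTexture=True, name="armCube_baseColor")
cmds.setAttr(tex + ".fileTextureName", "textures/SciFi_PVC_baseColor.png", type="string")
cmds.connectAttr(tex + ".outColor", shader + ".color", force=True)

# Assign the shader to the original cube only; the MASH copies pick it up.
cmds.sets("baseCube", edit=True, forceElement=sg)
```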

This is the final outcome rendered with my light setup:

first version

ArmOld.png

final render

ArmFinal.png

Final render of my arm movement. On its own it looks a bit weird in my opinion, but with the footage in the background it works. There is a weird shadow towards the end, which apparently turned out to be caused by my keyframe animation. I'm not sure why it has such an extreme effect, but the cubes in that section suddenly tilt in a way that casts a strange-looking shadow. I tried to fix it, but there weren't any keyframes in that particular frame range, so in the end I had to use this version because I ran out of time.

Nuke

I expected the compositing part of my project to be fairly small, just a bit of roto and clean-up, but it turned out to be harder than expected to make everything look really good. It is also arguably the most important part: it finishes everything off and adds those last few per cent that make the effect believable.

This is my final Nuke script. The main things I did were (a rough sketch of the core node tree follows this list):

  • integrate the arm into the scene

  • clean up a bit of the upper arm/sleeve/chair

  • roto the arm to fit into the sleeve and the watch

  • create/animate shadows for the watch and the sleeve

  • remove the first bright reflection of the watch and recreate the original bit of reflection

  • recreate the second light spill of the watch on the arm

  • match the random motion blur

  • clean up the markers

  • recreate the frizzy hair that's lost under the cleaned up markers

  • clean up the top bit that's messed up from the 3D stabilization

  • redistort the footage

Here is a before and after of all the major steps:

Frame 1089

Before: Original Plate

After: Added CG Arm

Before: Original CG Arm

After: Grade/Defocus/MotionBlur/Grain

Before: Grade/Defocus/MotionBlur/Grain

After: Arm rotoed

Before: Arm rotoed

After: Shadows sleeve and watch

Before: Shadows sleeve and watch

After: Recreate watch

Before: Recreate watch

After: Recreate watch reflection and colour

Before: Markers

After: Markers Clean Up

Frame 1115

Before: 

After: Upper arm left-side clean-up, plus chair and partial sleeve recreation

The next few images have their gamma increased for better visualisation.

Frame 1102

Before: 

After: Cleaned Up Top

Before: 

After: Recreated hair patches

Edit: I just realised that you need a premium version (which isn't free) of these before/after sliders to show more than two, which is why they are not showing on the page.

And finally, this is my finished final project plus breakdown. In the end there wasn't enough time to do more than one shot (the breakdown of this shot took a lot longer than expected as well), but I'm quite happy with how the shot turned out.
