I’ve added some more details to the palisade asset. First, I added UVs to allow for texturing. Here’s a view with some stand-in textures; the final product, of course, will need the same stylized, painterly textures being used for the other assets. But this at least provides a sense of how it might come together:
I also broke up the ends of the horizontal braces so they aren’t perfectly squared off but look a bit more ragged or splintered. Then I doubled the lashings holding each brace to the upright posts and added more variety to that geometry: different colours, thicknesses, etc. Here are untextured and textured views:
Procedural UVs are always challenging; I’ve tried to make sure the UV seams on the cylindrical pieces of wood always land on the ‘outside’ of the palisade structure, so they won’t be visible from inside the compound. We’ll have to see if this is a successful approach, or if we need to find a different solution.
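The seam-placement idea is easy to sketch outside of Houdini. This is not the actual network, just a hypothetical Python sketch of the logic: the seam gets rotated to face the palisade’s outward direction, and we can check whether it is hidden from a viewer inside the compound.

```python
import math

def seam_angle(outward_normal):
    """Angle (radians) around a stake's vertical axis where the UV
    seam should sit so it faces the outside of the palisade.
    `outward_normal` is the 2D (x, z) direction pointing away from
    the compound interior."""
    nx, nz = outward_normal
    return math.atan2(nz, nx)

def seam_is_hidden(seam_angle_rad, view_dir):
    """True if the seam faces away from a viewer standing inside the
    compound (`view_dir` points from the viewer toward the stake)."""
    sx, sz = math.cos(seam_angle_rad), math.sin(seam_angle_rad)
    vx, vz = view_dir
    # The seam is hidden when it points roughly the same way the
    # viewer is looking, i.e. it sits on the stake's far side.
    return sx * vx + sz * vz > 0.0
```
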
My next assigned item is a drying rack. I immediately decided to make an orthographic drawing based on the photos provided, to make sure the scale was right. The hide is an estimated 4 feet across, so the drying rack has to be about 7 feet. Once the drawing was done, I took it into Maya to start modelling! I started with the branches, making sure they were strong but also had some natural curves and sway to them. Once they were all aligned properly for the structure, I began to ‘secure’ them together with string. The string would likely have been made from natural materials, so it had to look mismatched and imperfect. This is why I made the strings with varying thicknesses and used different knot styles, since more than one person may have constructed the rack. In Mudbox, I added textures to the wooden branches and to the centre of the hide, where it wouldn’t have been stretched as much.
I’ve continued my work on the procedural palisade. I corrected the shape of the uprights to reflect the fact that these were essentially entire small trees stripped of branches, not thick posts sharpened to a point.
I began testing out the procedural generation using a few different input shapes: a line, a circle, and an arbitrary curve:
Note the last test shape, the curve, travels up and down as well as snaking along the ground. It will be important for our palisade to work properly across uneven terrain.
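The core of those tests, stripped of the Houdini specifics, is resampling the input curve at even stake spacing and then letting the terrain drive each stake’s height. A minimal Python sketch of that idea (the function names and the flat polyline are illustrative, not taken from the actual asset):

```python
import math

def resample_curve(points, spacing):
    """Place stake positions at even arc-length intervals along a
    polyline of (x, z) tuples -- a stand-in for what the Houdini
    network does with its input curve."""
    out = [points[0]]
    leftover = 0.0  # arc length travelled since the last stake
    for (x0, z0), (x1, z1) in zip(points, points[1:]):
        seg = math.hypot(x1 - x0, z1 - z0)
        d = spacing - leftover
        while d <= seg:
            t = d / seg
            out.append((x0 + t * (x1 - x0), z0 + t * (z1 - z0)))
            d += spacing
        leftover = (leftover + seg) % spacing
    return out

def drop_to_terrain(stakes, height_fn):
    """Give each (x, z) stake a y value sampled from the terrain so
    the palisade follows uneven ground."""
    return [(x, height_fn(x, z), z) for x, z in stakes]
```

With a heightfield function plugged in for `height_fn`, the stakes ride up and down with the ground, which is exactly what the third test shape exercises.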
I also devised some methods for adding the horizontal braces and a simple version of the lashing that holds everything together. I built in a switch to make it easy to flip the inside/outside direction of the palisade:
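The inside/outside switch boils down to flipping the sign of the offset that pushes the braces (and lashings) off the line of uprights. A tiny hypothetical sketch of that control, not the actual Houdini parameter:

```python
def brace_offset(outward_normal, thickness, inside_out=False):
    """Offset direction for the horizontal braces. Flipping the
    `inside_out` switch mirrors the braces to the other face of
    the palisade."""
    sign = -1.0 if inside_out else 1.0
    nx, nz = outward_normal
    return (sign * nx * thickness, sign * nz * thickness)
```
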
Finally, I updated the asset in Unreal to see the results in the game engine. Not bad as a starting point!
The next step will be to figure out how to use an input curve from Unreal to control the path of the palisade. And after that… finally… I’m afraid I won’t be able to put off dealing with UVs any longer. Ugh!
The other day I went through the process of modelling an axe with our CG modelling team. The model was based on existing references assembled by the research team. I did have a few initial questions on how the axe head connected to the handle. Was it fastened to a wedge-shaped end, or was it inserted into an opening like a thread through a needle? It turned out to be the latter.
The modelling process was recorded and the session is posted below.
Another aspect of this week is to develop a simple workflow that will give us the ability to produce assets of similar quality. Consistent scale will be vital, so we’re not guessing how big or small objects are in the game engine (especially if we’re dealing with hundreds of objects). I created a proxy character that is approximately 5’8″ that we’ll import into Maya before modelling begins to make sure every object is sized accordingly.
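Since Unreal (and our Maya scenes) work in centimetres, the 5’8″ proxy converts like this. A trivial but handy sketch; the function name is my own:

```python
def feet_inches_to_cm(feet, inches=0):
    """Convert an imperial height to centimetres, the working unit
    in Unreal. 5'8" is our proxy character's height."""
    return (feet * 12 + inches) * 2.54
```
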
At the moment, the order of events from beginning to end will go like this:
1. Model a high-resolution version to get approval from the community (detailed in Mudbox or ZBrush).
2. Retopologize the model to create the highest-resolution mesh to be used as an in-game asset, an LOD (level of detail) of 1. This is then UV mapped and sent to Substance Painter for texturing.
3. In Substance Painter, use the shaders we’ve created to maintain a consistent look across all assets. There will be a shader for every type of material (rock, wood, rope, etc.). The hue of the texture will need to shift in game to differentiate between indoors and outdoors.
4. In Maya, set up the initial shader and create the lower levels of detail.
5. Send to Unreal (I’m still looking into the best approach for this). I’m hoping the “Send to Unreal” button works the way it’s supposed to.
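The indoor/outdoor hue shift mentioned in the texturing step is simple to express in code. A minimal sketch using Python’s standard `colorsys` module (the 120-degree example value is mine, just to show the mechanics; the real shift would be far subtler):

```python
import colorsys

def shift_hue(rgb, degrees):
    """Shift a texture tint's hue by `degrees` -- e.g. a cooler
    tint indoors versus outdoors. RGB components are in [0, 1]."""
    h, l, s = colorsys.rgb_to_hls(*rgb)
    h = (h + degrees / 360.0) % 1.0
    return colorsys.hls_to_rgb(h, l, s)
```
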
The wrapping is baked into the handle to reduce the overall amount of detail. Using the colour masks in Substance helps that element maintain its structure; otherwise, it seems to disappear with the painterly effect.
I also tried out the LOD (level of detail) setup in Maya. I had to make sure the transition from the denser model visible close to the viewer would be seamless as it moved further away into the distance. It turns out it was reasonably seamless, thanks to the textures bridging the transition. I also discovered (thankfully) that the poly reduce retains all the UV information. This will let us generate the LODs relatively painlessly, literally in seconds. For more information on LODs in Maya, check out this link: Edit LOD threshold values | Maya 2016 | Autodesk Knowledge Network.
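The threshold logic behind an LOD group amounts to something like this sketch (a simplified distance-based version; Maya and Unreal each have their own switching criteria, and the threshold values here are placeholders):

```python
def pick_lod(distance, thresholds):
    """Pick which LOD mesh to display for a given camera distance.
    `thresholds` are the ascending switch distances, e.g.
    [10.0, 25.0, 60.0] selects among LODs 0..3."""
    for lod, limit in enumerate(thresholds):
        if distance < limit:
            return lod
    # Beyond the last threshold, use the coarsest mesh.
    return len(thresholds)
```
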
I am (I think!) the ‘Houdini expert’ Kris mentioned earlier. Now that I’ve joined the project, I’m looking at ways to leverage Houdini’s powerful proceduralism to help create the necessary assets for this project.
A starting point was to ensure we could use Houdini and Unreal in combination with one another. This is possible thanks to the Houdini Engine plugin. Today I got that installed and tested.
I’m using the palisade for my trial object; it will be useful to be able to simply draw a curve on the ground in Unreal and then have Houdini’s procedural engine generate a detailed model based on that. I’m starting with fairly simple stakes as stand-in objects, but eventually we will be able to replace those with much more detailed models.
Below you can see the palisade asset in Houdini and in Unreal, showing how a custom control called ‘length’ can be used to generate a shorter or longer section of the structure.
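Under the hood, a ‘length’ control like this mostly just drives how many stakes get instanced. A hypothetical one-liner of the idea (spacing value and clamping are my own assumptions, not the asset’s actual parameters):

```python
import math

def stake_count(length, spacing):
    """How many uprights a palisade section of the given length
    needs, with at least two so a section is never degenerate."""
    return max(2, int(math.floor(length / spacing)) + 1)
```
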
1. I started by bringing in the height reference and importing the OBJ of the 3D scan.
2. Next, I made it live and started retopologizing the surface with the Quad Draw tool.
3. Once I was done going over the whole surface, I “relaxed” the edges, got out of Quad Draw mode, hid the original references, and was left with this low-poly object.
4. I renamed my OBJ and hit 3 to see the smoothed preview, and it’s ready to bring into Mudbox!
5. In Mudbox, I increased the subdivisions so I would have more resolution to work with.
6. I then used the sculpt tools and stamps to make it feel like a real object, and not just something that exists online. Some of the things I did were adding light indents where the fingers would likely have gripped the stone, and adding scrape marks around the centre of the stone, because that’s where the pestle, spoon, or other tool would have been hitting the object.
Today I hunkered down and started to learn SpeedTree for Unreal (https://store.speedtree.com/ue4/). Foliage will play a crucial compositional role in the new version, and there is a lot of work ahead to get the look working aesthetically. I ran through building a tree from scratch, testing its lightmaps, and then bringing it into Unreal. For the most part, it’s pretty straightforward.
I began with a birch tree because of the graphic nature of the bark. I used Photoshop to create the bark texture and then brought it into Substance Alchemist to make it tileable. I’m a little stunned: it looks like Adobe has just rebranded and changed the pricing structure for the Substance products. Substance Alchemist is now Substance Sampler (https://www.adobe.com/ca/products/substance3d-sampler.html).
I’m debating whether I can get away with only the diffuse maps. This would also save time hooking up textures in Unreal. Due to the stylized nature of the project, I think we can use generic leaves for most trees with slight changes in their hue and saturation. Once I had an initial shape that was working, I discovered the “randomize” button. This feature will quickly generate variations of the same tree…which was awesome.
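The hue-and-saturation variation idea can be sketched on the material side, too. This is a hypothetical Python sketch (names and jitter amounts are mine), seeded per tree so each instance keeps a stable tint:

```python
import colorsys
import random

def leaf_tint(base_rgb, seed, hue_jitter=0.03, sat_jitter=0.15):
    """Generate a per-tree leaf tint by slightly jittering the hue
    and saturation of a shared base colour -- the material-side
    counterpart to SpeedTree's randomize button."""
    rng = random.Random(seed)  # same seed -> same tint every time
    h, s, v = colorsys.rgb_to_hsv(*base_rgb)
    h = (h + rng.uniform(-hue_jitter, hue_jitter)) % 1.0
    s = min(1.0, max(0.0, s + rng.uniform(-sat_jitter, sat_jitter)))
    return colorsys.hsv_to_rgb(h, s, v)
```
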
I then took everything into the Unreal Engine. It’s been almost 5 years since I used it last…so it was a bit of a learning experience for me. The last time I used it was on the el-Hibeh reconstruction (https://elhibeh.blog/).
The above image is the first pass (definitely ignore the sky). I need to explore a more illustrative direction…but I think this is going in the right direction. Using subsurface scattering seems to soften the overall look. Before calling it a day, the last thing I did was to bring in the longhouse and apply the shaders to the exterior pieces.
There were three things that I decided to work on today.
1. Begin to investigate approaches for procedurally generating components for the longhouse.
2. Continue to develop the painterly look inside Substance Painter.
3. Develop the beginnings of our research book shared amongst the various teams working on this project.
For the procedural aspect of this project, it would be excellent to randomize various sections of the longhouse. That way, we can make some structures shorter and some forms longer and randomize all of the individual components within each section. Elements that could be randomized would include: pot arrangements, throw skins and bed covering positions, firepit configurations, drying food hanging from the ceiling area, and even the posts and support structures. The last would need extra attention so that the hookups would align when the sections are duplicated. We would need 2 main components (see attached): a midsection piece and a vestibule.
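As a thought experiment, the section idea can be sketched as data before any Houdini work happens. This is purely illustrative (the component names, props, and structure are placeholders, not final asset names): a vestibule caps each end, and every midsection draws its own random set of interior elements.

```python
import random

def longhouse_plan(n_midsections, seed):
    """Assemble a longhouse from the two main components: a
    vestibule at each end and a randomized run of midsections
    in between."""
    rng = random.Random(seed)
    props = ["pots", "throw skins", "firepit", "drying food"]
    sections = [{"type": "vestibule"}]
    for _ in range(n_midsections):
        sections.append({
            "type": "midsection",
            # each midsection gets its own random subset of props
            "props": rng.sample(props, k=rng.randint(1, len(props))),
        })
    sections.append({"type": "vestibule"})
    return sections
```

The posts and support structures would need an extra alignment pass on top of this, so the hookups line up when sections are duplicated.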
This Maya plug-in does something similar to what I was thinking. However, since we will have a Houdini expert joining our team, perhaps this could be done more effectively (and with more control) in that software.
Moving on to the texturing, I finished the first pass of resurfacing the interior of the longhouse. Before going back and tweaking what I’ve done so far, I’d like to investigate a way of bringing everything into the Unreal Engine from Substance Painter. There’s a plug-in I purchased a few years ago that exports entire shading networks to Maya, called Substance Live Link (https://gumroad.com/xolotlstudio?sort=page_layout). I hope there’s a way to do something similar for Unreal, because this would save a ton of work when hooking up all of the textures and materials. What I really hope exists is a way of exporting the fully textured model from Substance Painter into Unreal (so that I can avoid the manual setup of the materials entirely…but I think that’s just wishful thinking).
There are a few things that still need to be tweaked or are still missing. The fire logs have older UVs, so it was difficult to get the effect I wanted; I’ll need to export and update those another day. Also, the bed coverings are missing (I temporarily hid them) because the cut maps weren’t working as well as they could in the display mode I was using. Here’s a slightly choppy flythrough from inside Substance Painter.
Now that we’ve discussed previous versions of the longhouse, let’s look at where we are and how we’re going to move forward. This new version of the longhouse is being led by Namir Ahmed from Ryerson University. Longhouse 5.0 will realize all of the aspects that we weren’t able to incorporate into 4.0. In collaboration with the Huron-Wendat community, the concept for this new visualization will deal with three iterations of village lifecycles: past, present, and future. A typical village would be used for around 30 years before the surrounding resources were diminished. With this in mind, a viewer will begin in a vibrant mid-lifecycle village. They can then travel to a village that has been abandoned and is in a state of decay, and can progress to a new village site where construction is only beginning. All three iterations will be pre-contact.
Another significant advancement in 5.0 will be the inclusion and representation of the village’s inhabitants. It was communicated to me that the lack of people in previous versions was somewhat unsettling. In 4.0, audio was added to infer that the area was populated: people conversing, children playing, fire crackling, wind, the running water of the river, a change between inside and outside. One of the challenges of including people is authentically representing the Indigenous groups. An idea I had was to utilize the new MetaHuman tools (https://www.unrealengine.com/en-US/digital-humans?sessionInvalidated=true) that Epic Games has just released. Essentially, the Indigenous population would be able to craft their own avatars. The software is free and online for anyone to test out, making it highly accessible.
Using MetaHumans has several other benefits. They are downloadable and editable. They are fully rigged to animate in the Unreal Engine. They’re characterized, in case you need to use motion capture. It seems like a very promising option for populating the world.
With a considerably larger scope and scale, and potential problems with the uncanny valley, we’re looking at a new style that moves away from the increasing photorealism of the previous visualizations. This could help speed up development, which will be beneficial given the fairly condensed timeline. I’m going to be investigating a more painterly look over the next 2 weeks (leading up to full production) while I await the design direction from the community. The Miyazaki/Studio Ghibli style has a fascinating painterly look (found in movies such as Spirited Away). The challenge will be how impressionistic the elements can get while maintaining the level of materiality found in real objects.
I’ve started to experiment with the existing longhouse in Substance Painter today. The results are promising so far.
Welcome to my new metablog. My name is Kris Howald, and I’m a Professor with Sheridan College’s Computer Animation program. I’m not sure how long these posts will be, but I’m going to try to do my best to document the process for developing the new version of the longhouse project.
I want to start by looking back a few years to the previous version of the longhouse, 4.0. In 2018, I was approached by a friend of mine, Dr. Michael Carter, to continue the development of an existing VR longhouse (http://theskonkworks.com/2015/07/longhouse-3-0-5/). The initial vision for this new iteration was a multi-structure environment that would indicate the scope of these types of settlements. However, due to certain constraints, the direction shifted to updating the look and feel of the previous version. CG veteran Craig Barr did an excellent job building the 3.0 longhouse, yet the technology of the time (around 2015) limited the polycount, texture resolution, and lighting, producing a result that was due for a refresh. I was hoping to add an extra level of realism to this version. To do this, I wanted to focus on lighting and how light would interact with various materials, and to introduce global illumination so that there would be more realistic light bounces within the structure’s interior.
I started by simplifying the number of objects by combining similar geometry, and then laying out their new UVs. Once I had fewer polygon objects to work with, I began to project the original textures onto the updated UV layouts. Once all of the original textures were transferred to the new geometry through baking, I brought the new textures into the procedural texture editing software Substance Painter and Substance Alchemist. This allowed me to add the wear and tear the objects needed; the software also separated out the roughness, metallic, normal, diffuse, and height maps, and let me add more customized features to give the materials a more natural feel. With the new textures from the Substance software, I re-applied them to my 3D model in Maya, where I lit the scene and then baked out the lighting with the proper shadows and GI. Below are examples comparing the model with and without the baked-in lighting.
The following two movies are flythroughs of the model in Maya, one with the textures and one with the textures plus lighting.
Once all that was ready, I brought it into the Unity game engine, where I created the environment, adding trees, grasses, rivers, and additional natural elements.
Here is an example of the final 4.0 result compared to the previous version.
If you’re interested in additional information on longhouse 4.0, I will link an article in the Sheridan College online paper called the Insider.