CINEVERSITY

  • [Help] Introduction

    View Tutorial

    Thank you for taking the time to report your issue with this Cineversity tutorial.
    We apologize that you encountered an obstacle with the material. We’ve created this template to help you clarify the issue you are having so we can assist you more quickly.

    What's the specific issue you are having?

    Please include a timestamp (where in the video the problem occurs) if applicable.

    What are the steps to replicate the problem?
    What steps are you taking that seem to cause the problem?
    What version of the software are you using?
    Any and all info is helpful. Thank you.

    If the question is not directly related to the tutorial, please refer to the Q&A forum

    posted in Tutorial Discussions
  • RE: Shader field UV projection

    Hi sweet-boyfriend,

    Thank you for using Dropbox.

    I used 2024.3 to test this.

Yes, if I follow your steps, 2024.1 isn't working. Even if I use Flat projection, which I would use for a plane rather than a cube, it still fails in the older version.

    2024.3 works here.

    All the best

    posted in Question & Answers
  • RE: Shader field UV projection

    Hi sweet-boyfriend,

Thanks for the file, perfect. I had no image from you, but the shader entry referenced a JPG, so I used that format to recreate it.

Yes, I can reproduce that. I have never used a Shader Field in a Shader Effector while the Shader Effector itself had no data (image) in its Shading entry field; I would have used a Plain Effector in that case. White means a value of 1.0.

During the speed optimizations, many things were rewritten; my guess is that this caused the change.

    Anyway, if you need that, do this:

Place a tiny white image into Shader Effector > Shading > Shader field. Or just use the Color Shader set to white, if your renderer works with it.

    Example
    https://stcineversityprod02.blob.core.windows.net/$web/Cineversity_Forum_Support/2024_PROJECTS_DRS/20240228_CV4_2024_drs_24_MGsm_01.zip

    All the best

    posted in Question & Answers
  • RE: Changing the scale of character objects that have been bound and weighed

    Thanks for the reply, series-tune.

The two steps before and two steps after, with the adjustment session in between, is the shortest process in terms of reliability.
Please note that I haven't seen your rig, but I have no information that anything other than Joints is working on the surface.
Look at which tags are added to the rig: PoseMorph, XPresso, and Constraint tags might not work well with scaling.

    There is always the option to scale the whole project…

Not a suggestion! There is something quick and dirty: just "dump" everything under a Null and scale the Null.
However, when something else is working on the rig, that can backfire, and there goes the deadline.
So why do I mention it here? I see people share it as best practice from time to time. I stay away from it, far away!

    Sorry for the specific cloud service request. I know it isn't enjoyable, but I must keep it safe here for everyone.

    My best wishes for your project

    posted in Question & Answers
  • RE: Packing Flooring Boards

    Hi Greg,

    Thanks for the feedback.

    My best wishes for your project

    posted in Question & Answers
  • RE: Issue with "white" not being all "white"

    Hi RamoGambacciani,

    One thing that influences this is:
The scene is set up with no light, so it uses default frontal illumination. Place a sphere with the white material, and the center will differ from the (tangential) borders. If light didn't work that way, the sphere would look like a flat-colored 2D image.

The main factor that alters it is ACES and our fully dynamic use of data inside the apps.
Redshift 3D has no option to work in a 0 to 1.0 space: it aims for photorealism, which requires allowing values above 1.0.
Before solving the problem, I must address a common misconception: that sRGB values at 100% are "white". That is simply the point where all channels clip, and clipping reads as white. I think any sRGB/integer material should be avoided. I hear you saying that the example has no saturation at all, so let's talk only about white.

    If you take a white piece of paper and put a light on it, it will never reflect all the light back. Now think of a glossy porcelain cup, exposed so the white material is 100%; what will happen to the reflections from the bright window? As the white is already 100%, it will be clipped. So, what is the white you need in your image?
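A toy numeric sketch of that clipping in plain Python (illustrative values only, not Redshift's actual shading math):

```python
# Illustrative only: once diffuse white already sits at 100%, any specular
# contribution from a bright window is clipped away in a 0-1 buffer.
def display_value(diffuse, specular, clip=1.0):
    """Sum the contributions, then clip to the displayable range."""
    return min(diffuse + specular, clip)

paper = display_value(diffuse=1.0, specular=0.0)  # dull white paper
cup = display_value(diffuse=1.0, specular=0.6)    # glossy cup plus window highlight
# Both land at 1.0: the highlight is indistinguishable from the base white.
```

Lowering the diffuse value leaves headroom, so the highlight would still read as brighter than the base material.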

If you want to run tests with this method, skip the reflective approach: set up a surface that produces 100% Emission and do nothing else. That way you can track that one value along your pipeline.

White is not defined anywhere; if you search color science books, it is not simple to explain or pin down. Which color temperature does it have, and what value counts as white: the specular highlight, or the reflection of a super-bright studio lightbox?

The main idea is to establish Diffuse White based on the exposure we get when setting up a scene with a gray card. From there, we can say that the light used for the gray card will produce a "Diffuse White" value when the gray card is swapped for a white card (dull, not reflective). We often have cards with black, gray, and white fields, whereby black is typically not zero!

By defining Diffuse White this way, we leave headroom for super-whites and specular values.

There is more, but I can't squeeze a whole book into this post, e.g., how humans turn light frequencies into color and images.

    With that said, let's look at your settings. Thanks for the file!

In your file, the Project settings are set to sRGB, and the output file is a PNG.

1. ACES is set in the Redshift 3D Render Settings. My tip: toss that and set ACES in the Project settings instead.
2. Cinema 4D and Redshift 3D work in float and can therefore handle high-dynamic-range content.
With PNG, that is pretty much all gone, and the data gets squeezed into a tiny container. This is where your trouble starts.

We always need to check whether a value is stored in the image or produced on your screen.
With a PNG and no HDR display, both are limited.
ACES is built as a scene-referred pipeline, meaning the content is no longer changed wholesale to fit into a small screen "space" (gamut and dynamics). The display becomes a branched-off signal that is no longer the main pipeline, just an observer. The main data stays fully available until it is saved into a small format.

In this case, tone mapping is applied to avoid clipping values above 1.0. To tone-map a larger dynamic range into the small space of an 8-bit/channel PNG, headroom must be reserved for values above 1.0 (100%). With ACES 1.0 SDR Video, the white you would like to see at 100% needs to be 1630% in the scene, as that is what gets squeezed down to 100%. The white material in your scene would have to face the light perpendicularly to reflect most of it, but it doesn't, so the white is tone-mapped to something that looks grayish.

You can set Compensate for View Transform to off and the View to Un-tone-mapped in the Redshift settings. This should give you a white that is illuminated to read as 100%.

Here is one part of the problem: logo colors, for example, do not survive the export to SDR very well. Again, that is one part, not all of it, but it goes deeper into color science than I'd like to go here. I am working on something that will hopefully clear up the false information in so many sources that miss components of this professional pipeline.

In short, and with the limits of an analogy: ACES works like a camera's RAW format, while producing an 8-bit/channel PNG or JPG works like the camera's in-body conversion. The camera squeezes as much data as it can into the small file, hence the shifts in it, and that is why no working photographer enjoys JPG as a format during the pipeline.

You either take a 0 to 1.0 cutout from the HDR data, where anything above is clipped and appears white even if it had color before, or you tone-map and get a more naturalistic representation of our perception. Both at the same time is not possible.
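As a rough illustration of that either/or, here is clipping next to a simple Reinhard-style curve in plain Python (a toy stand-in, not the actual ACES transform):

```python
# Toy comparison: clipping vs. a simple Reinhard-style tone map
# for scene-referred values above 1.0.
def clip(x):
    """Anything above 1.0 is lost and reads as white."""
    return min(x, 1.0)

def reinhard(x):
    # Compresses the whole range so nothing ever clips, but a scene
    # value of 1.0 now lands at 0.5, which looks grayish on screen.
    return x / (1.0 + x)

for scene in (0.5, 1.0, 4.0, 16.3):
    print(scene, clip(scene), round(reinhard(scene), 3))
```

Note how every scene value above 1.0 collapses to the same clipped white, while the tone-mapped version keeps them distinct at the cost of darkening everything.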

A word about ACES: it is not a look, nor does it try to be one. The filmic appearance comes from that limited export. We are long past an SDR-only workflow, and most people (AFAIK) already have an HDR TV or are thinking of buying one. Producing in SDR for this target group will look dull at some point, as I have pointed out in the Cineversity forum for over ten years now.

    I hope that helps a little bit.

    If not, please have a look here:
    https://prolost.com/blog/aces

    All the best

    posted in Question & Answers
  • [Help] Extreme Extruded Cube - Part 2 Animation

    View Tutorial

    Thank you for taking the time to report your issue with this Cineversity tutorial.
    We apologize that you encountered an obstacle with the material. We’ve created this template to help you clarify the issue you are having so we can assist you more quickly.

    What's the specific issue you are having?

    Please include a timestamp (where in the video the problem occurs) if applicable.

    What are the steps to replicate the problem?
    What steps are you taking that seem to cause the problem?
    What version of the software are you using?
    Any and all info is helpful. Thank you.

    If the question is not directly related to the tutorial, please refer to the Q&A forum

    posted in Tutorial Discussions
  • RE: Packing Flooring Boards

    Hi Greg,

    There is no default option for this.
It takes some work to create (minutes), but then you get a wide variety.

    First, set up a standard board and clone it ten times in a row.
Apply Current State to Object, then Connect Objects + Delete.
Create copies of it and move neighboring points along the line direction.
I made each line of ten boards different, using the Rectangle Selection tool and then moving the points.
Check out the objects Cube 0.0 to Cube 0.4, which are individual versions; Cube 0.5 to Cube 0.9 are copies.

These go under a Cloner set to Blend mode with one clone (yes, just one). This Cloner gets a Random Effector set to Noise with Modify Clone enabled; uncheck Position under Parameters.
The single-clone Cloner is then a child of the Cloner that lays out the long "ten tile" lines as the entire floor.

The Random Effector will not blend between them, so the dimensions are different each time.
    Explore the Effector Tab section of the Random Effector in the file.

    Materials can be set via Selection Tags. Typically, they don't blend, but that is another theme.

… and yes, true randomness sometimes leads to lined-up boards. Never matching an edge would not be fully random 😉
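The row-variant idea above can be sketched in plain Python (the function names are hypothetical, not the Cinema 4D API):

```python
import random

def make_row(num_boards=10, base_len=100.0, jitter=15.0, rng=None):
    """One row of board lengths, each seam shifted by a random amount."""
    rng = rng or random.Random()
    return [base_len + rng.uniform(-jitter, jitter) for _ in range(num_boards)]

def seam_positions(row):
    """Cumulative seam positions along the row (the last board has no seam after it)."""
    pos, seams = 0.0, []
    for length in row[:-1]:
        pos += length
        seams.append(round(pos, 2))
    return seams

rng = random.Random(42)
rows = [make_row(rng=rng) for _ in range(5)]  # five distinct row variants
for a, b in zip(rows, rows[1:]):
    # Seams shared between neighboring rows would read as lined-up boards;
    # with true randomness this can happen, it is just unlikely.
    shared = set(seam_positions(a)) & set(seam_positions(b))
    print(len(shared))
```

This is only the length-jitter logic; in the scene itself the Cloner and Random Effector do the equivalent work on the point-edited Cube variants.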

    https://stcineversityprod02.blob.core.windows.net/$web/Cineversity_Forum_Support/2024_PROJECTS_DRS/20240228_CV4_2024_drs_24_MGfl_02.c4d.zip

    Screen Shot 2024-02-27 at 8.40.28 PM.jpg

    Cheers

    posted in Question & Answers
  • RE: Changing the scale of character objects that have been bound and weighed

    Hi series-tune,

    As mentioned:
Please share the file if you like (Dropbox, Google, Apple, Adobe, or WeTransfer; full URL, not a TinyURL, etc. Thank you; it keeps things safe here.)

    I will check later.

    posted in Question & Answers
  • [Help] Fractured Cube - Part 2 Animation

    View Tutorial

    Thank you for taking the time to report your issue with this Cineversity tutorial.
    We apologize that you encountered an obstacle with the material. We’ve created this template to help you clarify the issue you are having so we can assist you more quickly.

    What's the specific issue you are having?

    Please include a timestamp (where in the video the problem occurs) if applicable.

    What are the steps to replicate the problem?
    What steps are you taking that seem to cause the problem?
    What version of the software are you using?
    Any and all info is helpful. Thank you.

    If the question is not directly related to the tutorial, please refer to the Q&A forum

    posted in Tutorial Discussions