three.js - Shaders

  • What is a shader?
    • Program written in GLSL
    • Sent to the GPU
    • Position each vertex of a geometry
    • Colorize each visible pixel of that geometry
  • Actually, "pixel" isn't accurate: pixels belong to the screen, and each point of the render doesn't necessarily match a screen pixel, so we use the term "fragment" instead
  • We send a lot of data to the shader: vertex coordinates, mesh transformations, information about the camera, colors, textures, lights, etc. The GPU processes all of this data following the shader instructions
  • Once the vertices are placed by the vertex shader, the GPU knows which fragments of the geometry are visible and can proceed to the fragment shader
  • Vertex shader
    • position each vertex of a geometry
    • the same vertex shader is used for every vertex; some data, like the vertex position, differs per vertex, and this type of data is called attributes
    • some data, like the position of the mesh, is the same for every vertex; this type of data is called uniforms
    • we can send a value from the vertex shader to the fragment shader; these are called varyings, and the value gets interpolated between the vertices
  • Fragment shader
    • color each visible fragment of the geometry
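The varying interpolation mentioned above can be sketched in plain JavaScript (a hypothetical helper, not a three.js API): a value set per vertex is linearly blended for every fragment in between.

```javascript
// Hypothetical sketch of how the GPU interpolates a varying: a value set
// to 0.0 at vertex A and 1.0 at vertex B is blended for each fragment.
function interpolateVarying(valueA, valueB, t) {
  // t is the fragment's relative position between the two vertices (0..1)
  return valueA * (1 - t) + valueB * t
}

// a fragment exactly halfway between the vertices gets the average
console.log(interpolateVarying(0.0, 1.0, 0.5)) // 0.5
```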
  • Set up a base scene
  <script setup>
  import * as THREE from 'three'
  import {OrbitControls} from 'three/addons/controls/OrbitControls.js'
  import * as dat from 'dat.gui'

  /**
   * scene
  */
  const scene = new THREE.Scene()

  /**
   * test mesh
  */
  const geometry = new THREE.PlaneGeometry(1, 1, 32, 32)
  const material = new THREE.MeshBasicMaterial()
  const mesh = new THREE.Mesh(geometry, material)
  scene.add(mesh)

  /**
   * light
  */
  const directionalLight = new THREE.DirectionalLight('#ffffff', 4)
  directionalLight.position.set(3.5, 2, -1.25)
  scene.add(directionalLight)

  /**
   * camera
  */
  const camera = new THREE.PerspectiveCamera(
    35,
    window.innerWidth / window.innerHeight,
    0.1,
    100
  )
  camera.position.set(6, 4, 8)

  /**
   * renderer
  */
  const renderer = new THREE.WebGLRenderer()
  renderer.setSize(window.innerWidth, window.innerHeight)
  renderer.setPixelRatio(Math.min(window.devicePixelRatio, 2))
  document.body.appendChild(renderer.domElement)

  window.addEventListener('resize', () => {
    camera.aspect = window.innerWidth / window.innerHeight
    camera.updateProjectionMatrix()

    renderer.setSize(window.innerWidth, window.innerHeight) 
    renderer.setPixelRatio(Math.min(window.devicePixelRatio, 2))   
  }) 

  /**
   * axesHelper
  */
  const axesHelper = new THREE.AxesHelper(5)
  scene.add(axesHelper)

  /**
   * control
  */
  const controls = new OrbitControls(camera, renderer.domElement)
  controls.enableDamping = true

  /**
   * render
  */
  const tick = () => {

    controls.update()
    requestAnimationFrame(tick)
    renderer.render(scene, camera)
  }
  tick()

  /**
   * gui
  */
  const gui = new dat.GUI()
  </script>
base-scene.png
  • Create our first shaders with RawShaderMaterial
    • Replace the MeshBasicMaterial with a RawShaderMaterial
    • use the vertexShader and fragmentShader properties to provide the shaders
    /**
     * test mesh
    */
    const geometry = new THREE.PlaneGeometry(1, 1, 32, 32)
    const material = new THREE.RawShaderMaterial({
      vertexShader: `
        uniform mat4 projectionMatrix;
        uniform mat4 viewMatrix;
        uniform mat4 modelMatrix;
    
        attribute vec3 position;
    
        void main() {
          gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);
        }
      `,
      fragmentShader: `
        precision mediump float;
    
        void main() {
          gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
        }
      `
    })
    const mesh = new THREE.Mesh(geometry, material)
    scene.add(mesh)
    
    shaders.png
    • Move the shader code into its own files and import it. We know `import` is module syntax, normally used to load module files, but here we just need the imported content as a string, so the build needs support for parsing GLSL files
    • Useful GLSL references: Shaderific, the Khronos documentation, The Book of Shaders
    file-structure.png
    import testVertexShader from './shaders/test/vertex.glsl'
    import testFragmentShader from './shaders/test/fragment.glsl'
    
    /**
     * test mesh
    */
    ...
    const material = new THREE.RawShaderMaterial({
      vertexShader: `
    
      `,
      fragmentShader: `
    
      `
    })
    ...
    
    error.png
    • We can use vite-plugin-glsl or vite-plugin-glslify. glslify is something of a standard, but vite-plugin-glsl is easier to use and well maintained, so: npm i vite-plugin-glsl
    // vite.config.js
    ...
    import glsl from 'vite-plugin-glsl'
    ...
    ...
    
    export default defineConfig({
      plugins: [
        vue(),
        glsl(),
      ],
      ...
    })
    
    import testVertexShader from './shaders/test/vertex.glsl'
    import testFragmentShader from './shaders/test/fragment.glsl'
    
    /**
     * test mesh
    */
    ...
    const material = new THREE.RawShaderMaterial({
      vertexShader: testVertexShader,
      fragmentShader: testFragmentShader,
      // wireframe: true,  // some material properties still work, but things like color now have to be written in the shader
    })
    ...
    
    • Some notes on vertex.glsl
      • the clip space looks like a box
    // a uniform is an input shared by every thread: read-only, identical for every vertex
    uniform mat4 projectionMatrix; // projection matrix: transforms into clip space
    uniform mat4 viewMatrix; // view matrix: transformations relative to the camera
    uniform mat4 modelMatrix; // model matrix: the mesh's transformations (position, rotation, scale)
    
    // attributes of the BufferGeometry: per-vertex data, here each vertex's coordinates
    attribute vec3 position;
    
    // called automatically, returns nothing
    void main() {
      // gl_Position is a built-in vec4 holding the vertex position in clip space
      // besides x, y and z it carries a fourth component, w, used for the perspective division
      gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);
    }
    
    • Separate each matrix step; splitting vertex.glsl up gives finer control
    uniform mat4 projectionMatrix; // projection matrix: transforms into clip space
    uniform mat4 viewMatrix; // view matrix: transformations relative to the camera
    uniform mat4 modelMatrix; // model matrix: the mesh's transformations (position, rotation, scale)
    
    // attributes of the BufferGeometry: per-vertex data, here each vertex's coordinates
    attribute vec3 position;
    
    // called automatically, returns nothing
    void main() {
      // the one-liner, now split into separate steps:
      // gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);
    
      // apply the model matrix to the position attribute
      vec4 modelPosition = modelMatrix * vec4(position, 1.0);
      modelPosition.z += sin(modelPosition.x * 10.0) * 0.1;  // displace the model position
      // view position
      vec4 viewPosition = viewMatrix * modelPosition;
      // projected position
      vec4 projectionPosition = projectionMatrix * viewPosition;
    
      gl_Position = projectionPosition;
    }
    
    finer-control-after-splitting.png
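What `modelMatrix * vec4(position, 1.0)` actually computes can be verified on the CPU. This is a hypothetical plain-array sketch, not three.js code; like GLSL, it treats the mat4 as column-major.

```javascript
// multiply a column-major 4x4 matrix by a vec4: out[row] += m[col*4 + row] * v[col]
function transform(m, v) {
  const out = [0, 0, 0, 0]
  for (let row = 0; row < 4; row++) {
    for (let col = 0; col < 4; col++) {
      out[row] += m[col * 4 + row] * v[col]
    }
  }
  return out
}

// a model matrix that only translates by (2, 0, 0)
const modelMatrix = [
  1, 0, 0, 0,
  0, 1, 0, 0,
  0, 0, 1, 0,
  2, 0, 0, 1
]
console.log(transform(modelMatrix, [1, 1, 0, 1])) // [ 3, 1, 0, 1 ]
```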
    • Add a custom attribute to the vertex shader
    /**
     * test mesh
    */
    ...
    const count = geometry.attributes.position.count // number of vertices
    const randoms = new Float32Array(count)  // array of random values
    for(let i = 0; i < count; i++) {
      randoms[i] = Math.random()
    }
    // add the attribute: one random value per vertex
    geometry.setAttribute('aRandom', new THREE.BufferAttribute(randoms, 1))
    ...
    ...
    
    ...
    // attributes of the BufferGeometry: per-vertex data
    attribute vec3 position;
    attribute float aRandom;
    
    // called automatically, returns nothing
    void main() {
      ...
    
      // apply the model matrix to the position attribute
      vec4 modelPosition = modelMatrix * vec4(position, 1.0);
      // modelPosition.z += sin(modelPosition.x * 10.0) * 0.1;
      modelPosition.z += aRandom * 0.1;
      ...
    }
    
    custom-attributes.png
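As a sanity check on the buffer size (plain JavaScript, no three.js needed): a PlaneGeometry with 32 × 32 segments has (32 + 1) × (32 + 1) vertices, which is what `geometry.attributes.position.count` reports, so the attribute needs exactly one float per vertex.

```javascript
const segments = 32
const count = (segments + 1) * (segments + 1) // vertices of a 32x32-segment plane

const randoms = new Float32Array(count) // one 32-bit float per vertex
for (let i = 0; i < count; i++) {
  randoms[i] = Math.random() // in [0, 1)
}
console.log(count) // 1089
```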
    • Send data from the vertex shader to the fragment shader. Attributes aren't available in the fragment shader, so we use the varyings mentioned earlier
    ...
    ...
    varying float vRandom;
    
    // called automatically, returns nothing
    void main() {
      ...
      ...
      vRandom = aRandom;
    }
    
    ...
    varying float vRandom;
    
    void main() {
      gl_FragColor = vec4(0.0, vRandom, vRandom, 1.0);
    }
    
    passing-data-from-the-vertex-with-varyings.png
    • Some notes on fragment.glsl
    // float precision
    // highp may cause performance issues and isn't supported on all devices
    // lowp may produce artifacts from the lack of precision
    // mediump is the usual choice; with RawShaderMaterial a precision must be declared
    precision mediump float;
    
    void main() {
      // built-in output variable: (r, g, b, a)
      // changing only the alpha here has no effect; the material also needs transparent: true
      gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
    }
    
    
    /**
     * test mesh
    */
    ...
    const material = new THREE.RawShaderMaterial({
      ...
      transparent: true,
    })
    ...
    
  • uniform

    • Think of a CPU core as a pipe: tasks queue up and pass through one at a time (serially). Some tasks are bigger than others and take longer. To increase throughput, modern computers have multiple cores, and these pipes are called threads
    • Video and games need far more processing power than typical programs. Even an old 800 × 600 screen means 480,000 pixels to process every frame, which is a big problem for a CPU. Hence the GPU (Graphics Processing Unit): a huge number of small processors working in parallel
    • When the GPU processes tasks in parallel, each thread only produces data for its own part of the final image and can't exchange data with the others. We can feed input data from the CPU to every thread, but that input must be identical for all of them and read-only. This input data is what uniforms are
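That share-nothing model can be sketched in JavaScript (hypothetical names; on a real GPU each call runs on its own thread, in parallel):

```javascript
// the same read-only data is handed to every thread
const uniforms = { uBrightness: 0.5 }

// each "thread" sees only its own input plus the uniforms,
// and writes only its own output slot, never a neighbour's
function fragment(pixelValue, uniforms) {
  return pixelValue * uniforms.uBrightness
}

const pixels = [0, 1, 2, 3]
const output = pixels.map((p) => fragment(p, uniforms))
console.log(output) // [ 0, 0.5, 1, 1.5 ]
```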
    /**
     * test mesh
    */
    ...
    ...
    const material = new THREE.RawShaderMaterial({
      ...
      uniforms: {
        uFrequency: {value: new THREE.Vector2(10, 5)}
      }
    })
    
    // a uniform is an input shared by every thread: read-only
    ...
    uniform vec2 uFrequency;
    
    // called automatically, returns nothing
    void main() {
      ...
      ...
      // apply the model matrix to the position attribute
      vec4 modelPosition = modelMatrix * vec4(position, 1.0);
      modelPosition.z += sin(modelPosition.x * uFrequency.x) * 0.1;
      modelPosition.z += sin(modelPosition.y * uFrequency.y) * 0.1;
      ...
      ...
    }
    
    uniform.png
    • Add a GUI to watch the values change; as we drag the controls, the uniform updates automatically
    /**
     * gui
    */
    const gui = new dat.GUI()
    gui.add(material.uniforms.uFrequency.value, 'x').min(0).max(20).step(0.01).name('frequencyX')
    gui.add(material.uniforms.uFrequency.value, 'y').min(0).max(20).step(0.01).name('frequencyY')
    
    • Since uniforms can be updated dynamically, let's use them for an animation
    /**
     * test mesh
    */
    ...
    ...
    const material = new THREE.RawShaderMaterial({
      ...
      uniforms: {
        uFrequency: {value: new THREE.Vector2(10, 5)},
        uTime: {value: 0}
      }
    })
    ...
    ...
    
    /**
     * render
    */
    const clock = new THREE.Clock()
    const tick = () => {
      const elapsedTime = clock.getElapsedTime()
    
      // update material
      material.uniforms.uTime.value = elapsedTime
    
      controls.update()
      requestAnimationFrame(tick)
      renderer.render(scene, camera)
    }
    tick()
    
    ...
    uniform float uTime;
    
    // called automatically, returns nothing
    void main() {
      ...
      // apply the model matrix to the position attribute
      vec4 modelPosition = modelMatrix * vec4(position, 1.0); 
         
      modelPosition.z += sin(modelPosition.x * uFrequency.x - uTime) * 0.1;
      modelPosition.z += sin(modelPosition.y * uFrequency.y - uTime) * 0.1;
      ...
      ...
    }
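Restating the displacement as a plain function makes the animation's shape easy to check (uFrequency = (10, 5) as configured above; this is just the math, not shader code):

```javascript
const uFrequency = { x: 10, y: 5 }

// two sine waves, each with amplitude 0.1, sliding along with time
function elevation(x, y, uTime) {
  return Math.sin(x * uFrequency.x - uTime) * 0.1
       + Math.sin(y * uFrequency.y - uTime) * 0.1
}

console.log(elevation(0, 0, 0)) // 0: both sine terms vanish at the origin at t = 0
```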
    
    • Uniforms can also be sent to the fragment shader; let's try changing the color
    ...
    ...
    const material = new THREE.RawShaderMaterial({
      ...
      uniforms: {
        uFrequency: {value: new THREE.Vector2(10, 5)},
        uTime: {value: 0},
        uColor: {value: new THREE.Color('cyan')}
      }
    })
    
    const mesh = new THREE.Mesh(geometry, material)
    mesh.scale.y = 2 / 3
    scene.add(mesh)
    
    ...
    uniform vec3 uColor;
    
    void main() {
      // gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
      gl_FragColor = vec4(uColor, 1.0);
      ...
    }
    
    changing-color-via-uniform.png
    • Next, change to a texture. We need to pull the texture's pixels into the fragment shader, using texture2D()
    // shaders.vue
    /**
     * texture
    */
    const textureLoader = new THREE.TextureLoader()
    const flagTexture = textureLoader.load('/imgs/sponge.jpg') // files in public/ are served from the root by Vite
    
    
    /**
     * test mesh
    */
    ...
    const material = new THREE.RawShaderMaterial({
      ...
      uniforms: {
        uFrequency: {value: new THREE.Vector2(10, 5)},
        uTime: {value: 0},
        uColor: {value: new THREE.Color('cyan')},
        uTexture: {value: flagTexture}
      }
    })
    ...
    ...
    
    // vertex.glsl
    ...
    ...
    attribute vec2 uv;  // supplied in the geometry's attributes
    
    varying vec2 vUv;  // pass the uv from the vertex shader to the fragment shader
    
    void main() {
      ...
      ...
      vUv = uv;
    }
    
    // fragment.glsl
    ...
    ...
    varying vec2 vUv;
    ...
    uniform sampler2D uTexture; // sampler2D is the texture type
    
    void main() {
      ...
      ...
      vec4 textureColor = texture2D(uTexture, vUv); // sample the texture at this fragment's uv
      gl_FragColor = textureColor;
    }
    
    texture.png
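texture2D() samples with filtering; a nearest-neighbour sketch (hypothetical helper) shows the basic idea of mapping a uv coordinate in [0, 1] onto the texel grid. Note that in WebGL, v = 0 is the bottom row of the texture.

```javascript
// pick the texel whose cell contains the uv coordinate
function sampleNearest(texels, width, height, u, v) {
  const x = Math.min(width - 1, Math.floor(u * width))
  const y = Math.min(height - 1, Math.floor(v * height))
  return texels[y * width + x]
}

// a 2x2 "texture" stored row by row
const texels = ['red', 'green', 'blue', 'white']
console.log(sampleNearest(texels, 2, 2, 0.0, 0.0)) // red
console.log(sampleNearest(texels, 2, 2, 0.9, 0.9)) // white
```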
    • Color variation: brighten fragments whose vertices are high, i.e. closer to the camera
    // vertex.glsl  (changed so the elevation can be passed to the fragment shader)
    ...
    ...
    varying float vElevation;
    
    void main() {
      ...
      vec4 modelPosition = modelMatrix * vec4(position, 1.0);
    
      float elevation = sin(modelPosition.x * uFrequency.x - uTime) * 0.1;
      elevation += sin(modelPosition.y * uFrequency.y - uTime) * 0.1;
      modelPosition.z = elevation;
      ...
      ...
      vElevation = elevation;
    }
    
    // fragment.glsl
    ...
    ...
    varying float vElevation;
    
    void main() {
      vec4 textureColor = texture2D(uTexture, vUv); // sample the texture
      textureColor.rg *= vElevation * 2.0 + 0.8;
      gl_FragColor = textureColor;
    }
    
    color variation.png
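Since the two sine terms bound the elevation to ±0.2, the factor applied to the red and green channels stays within roughly [0.4, 1.2]: troughs darken, crests brighten. A quick check of the mapping:

```javascript
// the factor the fragment shader applies to textureColor.rg
function brightness(elevation) {
  return elevation * 2.0 + 0.8
}

console.log(brightness(-0.2)) // ≈ 0.4 at the deepest trough
console.log(brightness(0.2))  // ≈ 1.2 at the highest crest
```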
  • So far we've created shaders with RawShaderMaterial. Now that we know how it works, let's switch to the somewhat more concise ShaderMaterial
    • replace the material
    const material = new THREE.ShaderMaterial({
      ...
    })
    
    • Look at the error screenshot: the compiler says we're redefining these declarations, i.e. they already exist (ShaderMaterial provides them automatically), so we delete our duplicates. The final vertex file:


      error-after-changing-material.png
    // vertex.glsl
    uniform vec2 uFrequency;
    uniform float uTime;
    
    attribute float aRandom;
    
    varying vec2 vUv;
    varying float vElevation;
    
    void main() {
      vec4 modelPosition = modelMatrix * vec4(position, 1.0); 
    
      float elevation = sin(modelPosition.x * uFrequency.x - uTime) * 0.1;
      elevation += sin(modelPosition.y * uFrequency.y - uTime) * 0.1;
      modelPosition.z = elevation;
    
      vec4 viewPosition = viewMatrix * modelPosition; 
    
      vec4 projectionPosition = projectionMatrix * viewPosition;
    
      gl_Position = projectionPosition;
    
      vUv = uv;
      vElevation = elevation;
    }
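For completeness, a sketch of the matching fragment.glsl under ShaderMaterial: the material injects the precision declaration as well, so our own `precision` line can go; everything else stays as before (assuming the uniforms and varyings from the sections above).

```glsl
// fragment.glsl
uniform vec3 uColor;
uniform sampler2D uTexture;

varying vec2 vUv;
varying float vElevation;

void main() {
  vec4 textureColor = texture2D(uTexture, vUv);
  textureColor.rg *= vElevation * 2.0 + 0.8;
  gl_FragColor = textureColor;
}
```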
    