bool intersect(HitInfo& minHit, const Ray& ray) const {
    // set minHit.t as the distance to the intersection point
    // return true/false if the ray hits or not
    float tx1 = (minp.x - ray.o.x) / ray.d.x;
    float ty1 = (minp.y - ray.o.y) / ray.d.y;
    float tz1 = (minp.z - ray.o.z) / ray.d.z;
    float tx2 = (maxp.x - ray.o.x) / ray.d.x;
    float ty2 = (maxp.y - ray.o.y) / ray.d.y;
    float tz2 = (maxp.z - ray.o.z) / ray.d.z;

    // for a negative ray direction the near/far slab distances are swapped
    if (tx1 > tx2) std::swap(tx1, tx2);
    if (ty1 > ty2) std::swap(ty1, ty2);
    if (tz1 > tz2) std::swap(tz1, tz2);

    // t1 = latest entry across the three slabs, t2 = earliest exit
    float t1 = tx1; if (t1 < ty1) t1 = ty1; if (t1 < tz1) t1 = tz1;
    float t2 = tx2; if (t2 > ty2) t2 = ty2; if (t2 > tz2) t2 = tz2;

    // miss if the ray exits one slab before entering another,
    // or if the box is entirely behind the ray origin
    if (t1 > t2) return false;
    if ((t1 < 0.0f) && (t2 < 0.0f)) return false;

    minHit.t = t1;
    return true;
}
Ray Tracing a Triangle
Ray tracing and transforms
What if your object is transformed by an affine transformation?
Two options:
Inverse transform a ray, compute the intersection, then transform the intersection
Transform the object and compute the intersection
Option 1:
Bring a ray into the object space (transform the origin and the direction)
Compute intersection in the object space
Transform the intersection (position and normal)
The intersection test itself can be simpler (the object stays in its canonical form), but transforming every ray can make it slower
Option 1 can avoid storing transformed objects (see the sketch after this list)
Option 2:
Transform the object into the world space
Compute intersection in the world space
No transformation is needed for intersection
The intersection can sometimes be harder to compute (e.g., a sphere becomes an ellipsoid), but this option can be faster
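A minimal sketch of Option 1, assuming the course-style Ray/HitInfo/float3 types and hypothetical helpers transformPoint/transformVector/transpose plus stored forward and inverse matrices (none of these names come from the notes):

bool intersectTransformed(HitInfo& hit, const Ray& worldRay) const {
    // bring the ray into object space: points use the full inverse matrix,
    // directions ignore the translation part
    Ray objRay;
    objRay.o = transformPoint(invTransform, worldRay.o);
    objRay.d = transformVector(invTransform, worldRay.d); // left unnormalized so t keeps its world-space scale

    // intersect against the untransformed object in object space
    if (!object->intersect(hit, objRay)) return false;

    // bring the hit back to world space: position with the forward matrix,
    // normal with the inverse transpose
    hit.P = transformPoint(transform, hit.P);
    hit.N = normalize(transformVector(transpose(invTransform), hit.N));
    return true;
}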
Ray Marching
Ray marching is a class of rendering methods for 3D computer graphics where rays are traversed iteratively, effectively dividing each ray into smaller ray segments, sampling some function at each step.
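A minimal fixed-step sketch of the idea, assuming a hypothetical density() function being sampled and the float3 type from the course code (the step count, step size, and accumulation are illustrative choices, not a prescribed method):

float marchRay(const float3& origin, const float3& dir, float tMax) {
    const int numSteps = 128;          // how many segments the ray is divided into
    const float dt = tMax / numSteps;  // length of each segment
    float accumulated = 0.0f;
    for (int i = 0; i < numSteps; i++) {
        float t = (i + 0.5f) * dt;     // sample at the midpoint of each segment
        float3 p = origin + t * dir;
        accumulated += density(p) * dt; // sample some function along the ray
    }
    return accumulated;
}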
Ray-Triangle Intersection (What we do in practice)
Given a ray and a triangle, the objective is to compute (t,α,β,γ): the ray parameter t and the barycentric coordinates α, β, γ of the intersection point
For a 3×3 linear system a x + b y + c z = d with coefficient columns a, b, c and right-hand side d, Cramer's rule gives the unknowns x, y, and z as:
\[
x = \frac{\begin{vmatrix} d_1 & b_1 & c_1 \\ d_2 & b_2 & c_2 \\ d_3 & b_3 & c_3 \end{vmatrix}}
         {\begin{vmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{vmatrix}}, \qquad
y = \frac{\begin{vmatrix} a_1 & d_1 & c_1 \\ a_2 & d_2 & c_2 \\ a_3 & d_3 & c_3 \end{vmatrix}}
         {\begin{vmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{vmatrix}}, \qquad
z = \frac{\begin{vmatrix} a_1 & b_1 & d_1 \\ a_2 & b_2 & d_2 \\ a_3 & b_3 & d_3 \end{vmatrix}}
         {\begin{vmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{vmatrix}}.
\]
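For the triangle case, write o + t d = α p0 + β p1 + γ p2 with α = 1 − β − γ, which gives a 3×3 system in (β, γ, t). A sketch of evaluating the determinants with scalar triple products, assuming the course-style float3/Ray types and a cross() helper (the function name, parameter layout, and epsilon threshold are illustrative):

bool rayTriangle(const Ray& ray, const float3& p0, const float3& p1, const float3& p2,
                 float& t, float& alpha, float& beta, float& gamma) {
    // system: beta*(p1 - p0) + gamma*(p2 - p0) - t*ray.d = ray.o - p0
    float3 e1 = p1 - p0;
    float3 e2 = p2 - p0;
    float3 s  = ray.o - p0;

    // determinant of the coefficient matrix via the scalar triple product
    float det = dot(cross(e1, e2), -ray.d);
    if (fabsf(det) < 1e-8f) return false; // ray (nearly) parallel to the triangle plane

    // Cramer's rule: replace one column with the right-hand side each time
    beta  = dot(cross(s, e2), -ray.d) / det;
    gamma = dot(cross(e1, s), -ray.d) / det;
    t     = dot(cross(e1, e2), s) / det;

    alpha = 1.0f - beta - gamma;
    return (t > 0.0f) && (alpha >= 0.0f) && (beta >= 0.0f) && (gamma >= 0.0f);
}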
// loop over all of the point light sources
for (int i = 0; i < globalScene.pointLightSources.size(); i++) {
    float3 l = globalScene.pointLightSources[i]->position - hit.P;

    // the inverse-squared falloff
    const float falloff = length2(l);

    // normalize the light direction
    l /= sqrtf(falloff);

    // get the irradiance
    irradiance = float(std::max(0.0f, dot(hit.N, l)) / (4.0 * PI * falloff)) * globalScene.pointLightSources[i]->wattage;
    brdf = hit.material->Kd / PI;

    if (hit.material->isTextured) {
        brdf *= hit.material->fetchTexture(hit.T);
    }
    L += irradiance * brdf;
}
return L;
}
Shadow ray tracing
Return irradiance only if the light is visible
Ignore intersections that are too close (set a threshold distance)
// loop over all of the point light sources
for (int i = 0; i < globalScene.pointLightSources.size(); i++) {
    float3 l = globalScene.pointLightSources[i]->position - hit.P;

    // the inverse-squared falloff
    const float falloff = length2(l);

    // normalize the light direction
    l /= sqrtf(falloff);

    // get the irradiance
    irradiance = float(std::max(0.0f, dot(hit.N, l)) / (4.0 * PI * falloff)) * globalScene.pointLightSources[i]->wattage;
    brdf = hit.material->BRDF(l, viewDir, hit.N);

    if (hit.material->isTextured) {
        brdf *= hit.material->fetchTexture(hit.T);
    }
    L += irradiance * brdf;
}
return L;
}
// loop over all of the point light sources
for (int i = 0; i < globalScene.pointLightSources.size(); i++) {
    float3 l = globalScene.pointLightSources[i]->position - hit.P;

    // the inverse-squared falloff
    const float falloff = length2(l);

    // normalize the light direction
    l /= sqrtf(falloff);

    // get the irradiance
    irradiance = float(std::max(0.0f, dot(hit.N, l)) / (4.0 * PI * falloff)) * globalScene.pointLightSources[i]->wattage;
    brdf = hit.material->BRDF(l, viewDir, hit.N);

    if (hit.material->isTextured) {
        brdf *= hit.material->fetchTexture(hit.T);
    }

    // shadow ray
    Ray shadowRay = Ray(hit.P, l * sqrtf(falloff));
    HitInfo shadowHitInfo;
    // set tMin to a tiny value to avoid self-intersection
    // the distance to the light is sqrtf(falloff), so anything closer than that blocks the light
    if (globalScene.intersect(shadowHitInfo, shadowRay, 0.000001f, sqrtf(falloff))) {
        continue;
    }
    L += irradiance * brdf;
}
return L;
}
// If a bug happens, check whether etaI/etaO is reversed (which side has the material's ior?)
if (exiting) {
    vecN = -normal;
    eta = ior;          // leaving the material: etaI/etaT = ior / 1
} else {
    vecN = normal;
    eta = 1.0f / ior;   // entering the material: etaI/etaT = 1 / ior
}
cosThetaI = dot(incident, vecN);
k = 1 - (eta * eta) * (1 - cosThetaI * cosThetaI);
if (k < 0.0f) {
    return false; // total internal reflection: no refracted ray exists
}
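If k is non-negative, the refracted direction follows from Snell's law. A continuation sketch, under the assumption (not stated in the snippet) that incident is the normalized incoming ray direction pointing toward the surface, so cosThetaI is negative on entry; the result would be written to whatever output the surrounding function uses:

// refracted direction via Snell's law
// (assumes incident is normalized and points toward the surface)
float3 refracted = eta * incident - (eta * cosThetaI + sqrtf(k)) * vecN;
return true;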
"If for a direction vector in the world (Dx, Dy, Dz), the corresponding (u,v) coordinate in the light probe image is (Dx*r,Dy*r) where r=(1/pi)*acos(Dz)/sqrt(Dx^2 + Dy^2)"
If you apply that formula, you can convert a ray's direction to (u,v) coordinates in the range [-1.0, 1.0]
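A minimal sketch of that lookup, assuming a normalized direction and a hypothetical Image type with width/height members and a pixel() accessor (the remapping to pixel coordinates is an illustrative choice; only the (u,v) formula comes from the quote above):

float3 fetchEnvironment(const float3& dir, const Image& probe) {
    // r = (1/pi) * acos(Dz) / sqrt(Dx^2 + Dy^2)
    float denom = sqrtf(dir.x * dir.x + dir.y * dir.y);
    float r = (denom > 0.0f) ? (1.0f / PI) * acosf(dir.z) / denom : 0.0f; // guard the degenerate +/-z directions

    // (u, v) in [-1, 1]
    float u = dir.x * r;
    float v = dir.y * r;

    // remap to pixel coordinates
    int px = int((u * 0.5f + 0.5f) * (probe.width - 1));
    int py = int((v * 0.5f + 0.5f) * (probe.height - 1));
    return probe.pixel(px, py);
}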