• VRML is:
  • A simple text language for describing 3-D shapes and interactive environments
  • VRML text files use a .wrl extension
  • You can view VRML files using a VRML browser:
  • A VRML helper-application
  • A VRML plug-in to an HTML browser
  • You can view VRML files from your local hard disk, or from the Internet

    [ temple.wrl ]
  • You can construct VRML files using:
  • A text editor
  • A world builder application
  • A shape generator
  • A modeler and format converter
  • Text editor pros:
  • No new software to buy
  • Access to all VRML features
  • Detailed control of world efficiency
  • Text editor cons:
  • Hard to author complex 3D shapes
  • Requires knowledge of VRML syntax
  • World builder pros:
  • Easy 3-D drawing user interface
  • Little need to learn VRML syntax
  • World builder cons:
  • May not support all VRML features
  • May not produce most efficient VRML
  • Shape generator pros:
  • Easy way to generate complex shapes
  • Fractal mountains, logos, etc.
  • Shape generator cons:
  • Only suitable for narrow set of shapes
  • Best used with other software
  • Modeler and format converter pros:
  • Very powerful features available
  • Can make photo-realistic images too
  • Modeler and format converter cons:
  • May not support all VRML features
  • Not designed for VRML
  • One-way path from modeler into VRML
  • Easy to make shapes that are too complex
  • VRML Repository maintains links to available software:

    http://www.sdsc.edu/vrml
  • VRML files contain:
  • The file header
  • Comments - notes to yourself
  • Nodes - nuggets of scene information
  • Fields - node attributes you can change
  • Values - attribute values
  • more. . .
  • #VRML V2.0 utf8
    # A Cylinder
    Shape {
        appearance Appearance {
            material Material { }
        }
        geometry Cylinder {
            height 2.0
            radius 1.5
        }
    }
    #VRML V2.0 utf8

  • #VRML: File contains VRML text
  • V2.0 : Text conforms to version 2.0 syntax
  • utf8 : Text uses UTF8 character set
  • utf8 is an international character set standard

  • utf8 stands for:
  • UCS (Universal Character Set) Transformation Format, 8-bit
  • Encodes 24,000+ characters for many languages
  • ASCII is a subset
  • # A Cylinder

  • Comments start with a number-sign (#) and extend to the end of the line
    Cylinder {
    }

  • Nodes describe shapes, lights, sounds, etc.
  • Every node has:
  • A node type (Shape, Cylinder, etc.)
  • A pair of curly-braces
  • Zero or more fields inside the curly-braces
  • Cylinder {
        height 2.0
        radius 1.5
    }

  • Fields describe node attributes
    height 2.0

  • Every field has:
  • A field name
  • A data type (float, int, etc.)
  • A default value
  • Fields are optional and may be given in any order
  • Default value used if field not given
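  • For example (a minimal sketch), these two nodes build the same cylinder, since the radius field defaults to 1.0:

    Cylinder {
        height 3.0
        radius 1.0
    }
    Cylinder {
        height 3.0
    }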
  • The file header gives the version and encoding

  • Nodes describe scene content

  • Fields and values specify node attributes
  • Shapes are the building blocks of a VRML world

  • Primitive Shapes are standard building blocks:
  • Box
  • Cone
  • Cylinder
  • Sphere

  • [ prim.wrl ]
  • A Shape node builds a shape
  • appearance - color and texture
  • geometry - form, or structure

    Shape {
        appearance . . .
        geometry   . . .
    }
  • Shape geometry is built with geometry nodes:
    Box      { . . . }
    Cone     { . . . }
    Cylinder { . . . }
    Sphere   { . . . }
  • Geometry node fields control dimensions
  • Dimensions usually in meters, but can be anything
  • A Box geometry node builds a box

    Box {
        size 1.0 1.0 1.0
    }
  • A Cone geometry node builds an upright cone

    Cone {
        height 2.0
        bottomRadius 1.0
    }
  • A Cylinder geometry node builds an upright cylinder

    Cylinder {
        height 2.0
        radius 1.0
    }
  • A Sphere geometry node builds a sphere

    Sphere {
        radius 1.0
    }
    #VRML V2.0 utf8
    Shape {
        appearance Appearance {
            material Material { }
        }
        geometry Cylinder {
            height 2.0
            radius 1.5
        }
    }

    [ cylinder.wrl ]
  • Shapes are built centered in the world

  • A VRML file can contain multiple shapes

  • Shapes overlap when built at the same location
    #VRML V2.0 utf8
    Shape { . . . }
    Shape { . . . }
    . . .
    Shape { . . . }

    [ space.wrl ]
  • Shapes are built using a Shape node

  • Shape geometry is built using geometry nodes, such as Box, Cone, Cylinder, and Sphere
  • Text Shapes build text in a VRML world
  • Signs, notes, annotation, control panels
  • Text shapes are flat and have no thickness

  • You can choose the font family, style, size, etc.
  • A Text geometry node builds a text shape
  • text strings - the text to build
  • a font style - how to build the text
  • more . . .

    Shape {
        geometry Text {
            string    . . .
            fontStyle . . .
        }
    }
  • Text strings list the text to build

  • Each string is built on its own line or column

    Text {
        string [ "Text",
                 "Shape" ]
    }
  • A FontStyle node describes a font for a text shape
  • family - SERIF, SANS, or TYPEWRITER
  • style - BOLD, ITALIC, BOLDITALIC, or PLAIN
  • more . . .
  • FontStyle {
        family  "SERIF"
        style   "BOLD"
    }
  • A FontStyle node describes a font for a text shape
  • size - character height
  • spacing - row/column spacing
  • FontStyle {
        size    1.0
        spacing 1.0
    }
  • Font styles also have:
  • justification - major & minor directions
    • FIRST, BEGIN, MIDDLE, END
  • direction - left/right, up/down
  • FontStyle {
        justify [ "BEGIN", "FIRST" ]
        horizontal  TRUE
        leftToRight TRUE
        topToBottom TRUE
    }
    Shape {
        appearance Appearance { . . . }
        geometry Text {
            string [ "Text",
                     "Shape" ]
            fontStyle FontStyle {
                style  "BOLD"
            }
        }
    }

    [ text.wrl ]
  • Text geometry is built using a Text geometry node in a Shape node

  • Font style is controlled using a FontStyle node
  • By default, all shapes are built at the center of the world

  • A transform enables you to
  • Position shapes
  • Rotate shapes
  • Scale shapes

  • [ towers.wrl ]
  • A VRML file builds components for a world

  • A file's world components are built in the file's world coordinate system

  • By default, all shapes are built at the origin of the world coordinate system
  • A transform creates a coordinate system that is
  • Positioned
  • Rotated
  • Scaled
  • relative to a parent coordinate system

  • Shapes built in the new coordinate system are positioned, rotated, and scaled along with it
  • The Transform group node creates a group with its own coordinate system
  • children - shapes to build
  • translation - position
  • rotation - orientation
  • scale - size
  • Transform {
        translation . . .
        rotation    . . .
        scale       . . .
        children  [ . . . ]
    }
  • The children field includes a list of one or more nodes
    Transform {
        . . .
        children [
            Shape { . . . }
            Transform { . . . }
            . . .
        ]
    }
  • Translation positions a coordinate system in X, Y, and Z
    Transform {
        #           X   Y   Z
        translation 2.0 0.0 0.0
        children [ . . . ]
    }
  • Rotation orients a coordinate system about a rotation axis by a rotation angle
  • Angles are measured in radians
  • Transform {
        #        X   Y   Z    Angle
        rotation 0.0 0.0 1.0  0.52
        children [ . . . ]
    }
  • A rotation axis defines a pole to rotate around
  • Like the Earth's North-South pole
  • Typical rotations are about the X, Y, or Z axes:
    Rotate about    Axis
    X-Axis          1.0 0.0 0.0
    Y-Axis          0.0 1.0 0.0
    Z-Axis          0.0 0.0 1.0
  • To help remember positive and negative rotation directions:
  • Open your hand
  • Stick out your thumb
  • Aim your thumb in an axis positive direction
  • Curl your fingers around the axis

  • The curl direction is a positive rotation
  • Scale grows or shrinks a coordinate system by a scaling factor in X, Y, and Z
    Transform {
        #     X   Y   Z
        scale 0.5 0.5 0.5
        children [ . . . ]
    }
  • Scale, Rotate, and Translate a coordinate system, one after the other
    Transform {
        translation 2.0 0.0 0.0
        rotation 0.0 0.0 1.0  0.52
        scale 0.5 0.5 0.5
        children [ . . . ]
    }
    Transform {
        translation 4.0 0.0 0.0
        rotation    0.0 1.0 0.0  0.785
        scale       0.5 0.5 0.5
        children [ . . .  ]
    }
    [ arch.wrl ]
    [ arches.wrl ]
  • All shapes are built in a coordinate system

  • The Transform node creates a new coordinate system relative to its parent

  • Transform node fields control:
  • translation
  • rotation
  • scale
  • The primitive shapes have a default emissive (glowing) white appearance

  • You can control a shape's
  • Shading color
  • Glow color
  • Transparency
  • more . . .

  • [ colors.wrl ]
  • Recall that Shape nodes describe:
  • appearance - color and texture
  • geometry - form, or structure

    Shape {
        appearance . . .
        geometry   . . .
    }
  • An Appearance node describes overall shape appearance
  • material properties - color, transparency, etc.
  • more . . .

    Shape {
        appearance Appearance {
            material . . .
        }
        geometry . . .
    }
  • A Material node controls shape material attributes
  • diffuse color - main shading color
  • emissive color - glowing color
  • transparency - opaque or not
  • more . . .

    Material {
        diffuseColor  . . .
        emissiveColor . . .
        transparency  . . .
    }
  • Colors specify:
  • A mixture of red, green, and blue light
  • Values between 0.0 (none) and 1.0 (lots)
  • Color     Red   Green   Blue
    White     1.0   1.0     1.0
    Black     0.0   0.0     0.0
    Yellow    1.0   1.0     0.0
    Magenta   1.0   0.0     1.0
    Brown     0.5   0.2     0.0
    Shape {
        appearance Appearance {
            material Material {
                diffuseColor 1.0 1.0 1.0
            }
        }
        geometry . . .
    }

    [ slabs.wrl ]
  • The Appearance node controls overall shape appearance

  • The Material node controls overall material properties including:
  • Shading color
  • Glow color
  • Transparency
  • You can group shapes to compose complex shapes
  • VRML has several grouping nodes, including:
    Group       { . . . }
    Switch      { . . . }
    Transform   { . . . }
    Billboard   { . . . }
    Anchor      { . . . }
    Inline      { . . . }
    
  • The Group node creates a basic group
  • Every child node in the group is displayed
  • Group {
        children [ . . . ]
    }
  • The Switch group node creates a switched group
  • Only one child node in the group is displayed
  • You select which child
  • Switch {
        whichChoice 0
        choice [ . . . ]
    }
  • The Transform group node creates a group with its own coordinate system
  • Every child node in the group is displayed
  • Transform {
        translation . . .
        rotation    . . .
        scale       . . .
        children [ . . . ]
    }
  • The Billboard group node creates a group with a special coordinate system
  • Every child node in the group is displayed
  • Coordinate system is turned to face viewer
  • Billboard {
        axisOfRotation . . .
        children [ . . . ]
    }
  • A rotation axis defines a pole to rotate around
  • Similar to a Transform node's rotation field, but no angle (auto computed)
  • Group {
        children [
            Billboard {
                axisOfRotation 0.0 1.0 0.0
                children [ ... ]
            }
            Transform { . . . }
        ]
    }

    [ robobill.wrl ]
  • An Anchor node creates a group that acts as a clickable anchor
  • Every child node in the group is displayed
  • Clicking any child follows a URL
  • A description names the anchor
  • Anchor {
        url "stairway.wrl"
        description "Twisty Stairs"
        children [ . . . ]
    }
    [ anchor.wrl ]
    [ stairway.wrl ]
  • An Inline node creates a special group from another VRML file's contents
  • Children read from file selected by a URL
  • Every child node in group is displayed
  • Inline {
        url "table.wrl"
    }
    Inline { url "table.wrl" }
    . . .
    Transform {
        translation . . .
        children [
            Inline { url "chair.wrl" }
        ]
    }

    [ table.wrl, chair.wrl, dinette.wrl ]
  • The Group node creates a basic group

  • The Switch node creates a group in which only one chosen child is displayed

  • The Transform node creates a group with a new coordinate system
  • The Billboard node creates a group with a coordinate system that rotates to face the viewer

  • The Anchor node creates a clickable group
  • Clicking any child in the group loads a URL
  • The Inline node creates a special group loaded from another VRML file
  • If several shapes share the same geometry or appearance, you would otherwise have to duplicate the same node, once for each use

  • Instead, define a name for the first occurrence of a node

  • Later, use that name to share the same node in a new context
  • The DEF syntax gives a name to a node

    DEF RedColor Material {
        diffuseColor 1.0 0.0 0.0
    }

  • You can name any node
  • Names can be almost any sequence of letters and numbers
  • Names must be unique within a file
  • The USE syntax uses a previously named node

    Appearance {
        material USE RedColor
    }

  • A re-use of a named node is called an instance
  • A named node can have any number of instances
  • Each instance shares the same node description
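  • For example (a minimal sketch), one Material node defined with DEF and shared with USE:

    #VRML V2.0 utf8
    Shape {
        appearance Appearance {
            material DEF RedColor Material {
                diffuseColor 1.0 0.0 0.0
            }
        }
        geometry Box { }
    }
    Transform {
        translation 3.0 0.0 0.0
        children [
            Shape {
                appearance Appearance {
                    material USE RedColor
                }
                geometry Sphere { }
            }
        ]
    }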
  • Naming and using nodes:
  • Saves typing
  • Reduces file size
  • Enables rapid changes to shapes with the same attributes
  • Speeds browser processing
  • Names are also necessary for animation...

    [ dinette.wrl ]
  • DEF names a node

  • USE uses a named node
  • Nodes like Billboard and Anchor have built-in behavior

  • You can create your own behaviors to make shapes move, rotate, scale, blink, and more
  • Almost every node can be a component in an animation circuit
  • Nodes act like virtual electronic parts
  • Nodes can send and receive events
  • Wired routes connect nodes together
  • An event is a message sent between nodes
  • A data value (such as a translation)
  • A time stamp (when did the event get sent)
  • To spin a shape:
  • Connect a node that sends rotation events to a Transform node's rotation field
  • To blink a shape:
  • Connect a node that sends color events to a Material node's diffuseColor field
  • To set up an animation circuit, you need:
  • A node to send events
  • The node must be named with DEF
  • A node to receive events
  • The node must be named with DEF
  • A route connecting them
  • Every node has fields, inputs, and outputs:
  • field: A stored value
  • eventIn: An input
  • eventOut: An output
  • An exposedField is a short-hand for a field, eventIn, and eventOut
  • Some Transform node inputs:
  • set_translation
  • set_rotation
  • set_scale
  • Some Material node inputs:
  • set_diffuseColor
  • set_emissiveColor
  • set_transparency
  • Some TouchSensor node outputs:
  • isOver
  • isActive
  • touchTime
  • An OrientationInterpolator node output:
  • value_changed
  • A PositionInterpolator node output:
  • value_changed
  • A ROUTE statement connects two nodes together using
  • The sender's node name and eventOut name
  • The receiver's node name and eventIn name
  • ROUTE MySender.rotation_changed
       TO MyReceiver.set_rotation

  • Event data types must match!
    SFBool              SFRotation / MFRotation
    SFColor / MFColor   SFString / MFString
    SFFloat / MFFloat   SFTime
    SFImage             SFVec2f / MFVec2f
    SFInt32 / MFInt32   SFVec3f / MFVec3f
    SFNode / MFNode
  • Most nodes have exposedFields

  • If the exposed field name is xxx, then:
  • set_xxx is an eventIn to set the field
  • xxx_changed is an eventOut that sends when the field changes
  • The Transform node has:
  • rotation field
  • set_rotation eventIn
  • rotation_changed eventOut
  • DEF RotateMe Transform {
        rotation 0.0 1.0 0.0 0.0
        children [ . . . ]
    }
    DEF Rotator OrientationInterpolator { . . .  }
    
    ROUTE Rotator.value_changed
       TO RotateMe.set_rotation

    [ colors.wrl ]
  • You can have fan-out
  • Multiple routes out of the same sender
  • You can have fan-in
  • Multiple routes into the same receiver
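  • For example (a sketch, assuming nodes named Clock, Mover, and Fader have already been defined), fan-out routes one output to two receivers:

    ROUTE Clock.fraction_changed TO Mover.set_fraction
    ROUTE Clock.fraction_changed TO Fader.set_fraction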
  • Connect senders to receivers using routes

  • eventIns are inputs, and eventOuts are outputs

  • A route names the sender.eventOut, and the receiver.eventIn
  • Data types must match
  • You can have multiple routes into or out of a node
  • An animation changes something over time:
  • position - a car driving
  • orientation - an airplane banking
  • color - seasons changing
  • Animation requires control over time:
  • When to start and stop
  • How fast to go

  • [ floater.wrl ]
  • A TimeSensor node is similar to a stop watch
  • You control the start and stop time
  • The sensor generates time events while it is running

  • To animate, route time events into other nodes
  • A TimeSensor node generates absolute and fractional time events

  • Absolute time events give the wall-clock time
  • Absolute time is measured in seconds since 12:00am January 1, 1970!
  • Useful for triggering events at specific dates and times
  • Fractional time events give a number from 0.0 to 1.0
  • Values cycle from 0.0 to 1.0, then repeat

  • The number of seconds between 0.0 and 1.0 is controlled by the cycle interval
  • The sensor can loop forever, or run once and stop
  • A TimeSensor node generates events based upon time
  • start and stop time - when to run
  • cycle interval time - how long a cycle is
  • looping - whether or not to repeat cycles

    TimeSensor {
        cycleInterval 1.0
        loop FALSE
        startTime 0.0
        stopTime 0.0
    }
  • Create continuously running timers:
    loop TRUE
    stopTime <= startTime
  • Run one cycle then stop
    loop FALSE
    stopTime <= startTime
  • Run until the stop time, or until the cycle ends if not looping
    loop TRUE or FALSE
    stopTime > startTime
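  • For example (a minimal sketch), a continuously running timer with a 4-second cycle:

    DEF Clock TimeSensor {
        cycleInterval 4.0
        loop TRUE
        startTime 0.0
        stopTime  0.0
    }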
  • The set_startTime input event:
  • Sets when the timer should start
  • The set_stopTime input event:
  • Sets when the timer should stop
  • The first cycle starts at the start time

  • The cycle interval is the length (in seconds) of the cycle

  • Each cycle varies a fraction from 0.0 to 1.0

  • If loop is FALSE, there is only one cycle, otherwise the timer may cycle forever
  • The isActive output event:
  • Outputs TRUE at timer start
  • Outputs FALSE at timer stop
  • The time output event:
  • Outputs the absolute time
  • The fraction_changed output event:
  • Outputs values from 0.0 to 1.0 during a cycle
  • Resets to 0.0 at the start of each cycle
  • DEF Monolith1Timer TimeSensor {
        cycleInterval 4.0
        loop FALSE
        startTime 1.0
        stopTime  0.0
    }
    ROUTE Monolith1Touch.touchTime
       TO Monolith1Timer.set_startTime
    
    ROUTE Monolith1Timer.fraction_changed
       TO Monolith1Light.set_intensity

    [ monolith.wrl ]
  • To animate the position of a shape you provide:
  • A list of key positions for a movement path
  • A time at which to be at each position
  • An interpolator node converts an input time to an output position
  • When a time is in between two key positions, the interpolator computes an intermediate position
  • Each key position along a path has:
  • A key value (such as a position)
  • A key fractional time
  • Interpolation fills in values between your key values:
    Time    Position
    0.0     0.0 0.0 0.0
    0.1     0.4 0.1 0.0
    0.2     0.8 0.2 0.0
    . . .   . . .
    0.5     4.0 1.0 0.0
    . . .   . . .
  • A PositionInterpolator node describes a position path
  • keys - key fractional times
  • key values - key positions

    PositionInterpolator {
        key [ 0.0, . . . ]
        keyValue [ 0.0 0.0 0.0, . . . ]
    }
  • Route into a Transform node's set_translation input
  • The set_fraction input:
  • Sets the current fractional time along the key path
  • The value_changed output:
  • Outputs the position along the path each time the fraction is set
  • DEF Mover PositionInterpolator {
        key  [ 0.0, . . . ]
        keyValue [ 0.0 0.0 0.0, . . .]
    }
    ROUTE Clock.fraction_changed
       TO Mover.set_fraction
    
    ROUTE Mover.value_changed
       TO Movee.set_translation

    [ floater.wrl ]
  • To animate shape orientation, use an OrientationInterpolator

  • To animate shape color, use a ColorInterpolator

  • To animate shape transparency, use a ScalarInterpolator

  • To animate shape scale, use a PositionInterpolator (a handy trick)
  • An OrientationInterpolator node describes an orientation path
  • keys - key fractions
  • key values - key rotations (axis and angle)

    OrientationInterpolator {
        key [ 0.0, . . . ]
        keyValue [ 0.0 1.0 0.0 0.0, . . . ]
    }
  • Route into a Transform node's set_rotation input
  • A ColorInterpolator node describes a color path
  • keys - key fractions
  • values - key colors (red, green, blue)

    ColorInterpolator {
        key [ 0.0, . . . ]
        keyValue [ 1.0 1.0 0.0, . . . ]
    }
  • Route into a Material node's set_diffuseColor or set_emissiveColor inputs
  • A ScalarInterpolator node describes a scalar path
  • keys - key fractions
  • values - key scalars (used for anything)

    ScalarInterpolator {
        key [ 0.0, . . . ]
        keyValue [ 4.5, . . . ]
    }
  • Route into a Material node's set_transparency input
  • A PositionInterpolator node describes a position or scale path
  • keys - key fractional times
  • key values - key positions (or scales)

    PositionInterpolator {
        key [ 0.0, . . . ]
        keyValue [ 0.0 0.0 0.0, . . . ]
    }
  • Route into a Transform node's set_scale input

    [ squisher.wrl ]
  • The TimeSensor node's fields control
  • Timer start and stop times
  • The cycle interval
  • Whether the timer loops or not
  • The sensor outputs
  • true/false on isActive at start and stop
  • absolute time on time while running
  • fractional time on fraction_changed while running
  • Interpolators use key times and values and compute intermediate values

  • All interpolators have:
  • a set_fraction input to set the fractional time
  • a value_changed output to send new values
  • The PositionInterpolator node converts times to positions (or scales)

  • The OrientationInterpolator node converts times to rotations

  • The ColorInterpolator node converts times to colors

  • The ScalarInterpolator node converts times to scalars (such as transparencies)
  • You can sense when the viewer's cursor:
  • Is over a shape
  • Has touched a shape
  • Is dragging atop a shape
  • You can trigger animations on a viewer's touch

  • You can enable the viewer to move and rotate shapes
  • There are four main action sensor types:
  • TouchSensor senses touch
  • SphereSensor senses drags
  • CylinderSensor senses drags
  • PlaneSensor senses drags
  • The Anchor node is a special-purpose action sensor with a built-in response
  • All action sensors sense all shapes in the same group

  • Sensors trigger when the viewer's cursor touches a sensed shape
  • A TouchSensor node senses the cursor's touch
  • isOver - send true/false when cursor over/not over
  • isActive - send true/false when mouse button pressed/released
  • touchTime - send time when mouse button released
  • Transform {
        children [
            . . .
            DEF Touched TouchSensor { }
        ]
    }

    [ colors.wrl ]
  • A SphereSensor node senses a cursor drag and generates rotations as if rotating a ball
  • isActive - sends true/false when mouse button pressed/released
  • rotation_changed - sends rotation during a drag
  • Transform {
        children [
            DEF RotateMe Transform { . . . }
            DEF Rotator  SphereSensor { }
        ]
    }
    ROUTE Rotator.rotation_changed
       TO RotateMe.set_rotation
  • A CylinderSensor node senses a cursor drag and generates rotations as if rotating a cylinder
  • isActive - sends true/false when mouse button pressed/released
  • rotation_changed - sends rotation during a drag
  • Transform {
        children [
            DEF RotateMe Transform { . . . }
            DEF Rotator  CylinderSensor { }
        ]
    }
    ROUTE Rotator.rotation_changed
       TO RotateMe.set_rotation
  • A PlaneSensor node senses a cursor drag and generates translations as if sliding on a plane
  • isActive - sends true/false when mouse button pressed/released
  • translation_changed - sends translations during a drag
  • Transform {
        children [
            DEF MoveMe Transform { . . . }
            DEF Mover  PlaneSensor { }
        ]
    }
    ROUTE Mover.translation_changed
       TO MoveMe.set_translation
  • Multiple sensors can sense the same shape but. . .
  • If sensors are in the same group:
  • They all respond
  • If sensors are at different depths in the hierarchy:
  • The deepest sensor responds
  • The other sensors do not respond

  • [ lamp.wrl ]
  • Action sensors sense when the viewer's cursor:
  • is over a shape
  • has touched a shape
  • is dragging atop a shape
  • Sensors convert viewer actions into events to
  • Start and stop animations
  • Orient shapes
  • Position shapes
  • Use Shape nodes to build bars
  • Use Box nodes for bar geometry
  • Use Text and FontStyle nodes for label geometry
  • Use Appearance and Material nodes to select bar color
  • Use Transform nodes to position bars and labels
  • Use Shape node et al as usual
  • Use Transform nodes to position and orient shapes
  • Use CylinderSensor nodes to turn shapes about
  • Use Shape node et al as usual
  • Vary transparency and diffuseColor field values with voxel data
  • Use Shape node et al as usual
  • Use TimeSensor nodes to control animation
  • Use PositionInterpolator nodes to move plungers
  • Use OrientationInterpolator nodes to spin gadget
  • Use PositionInterpolator nodes to squish spheres
  • Use Shape node et al as usual
  • Use TimeSensor nodes to control animation
  • Use PositionInterpolator nodes to move shapes
  • Use ColorInterpolator nodes to color shapes
  • Bar plot task

    Gravity animator task
  • Complex shapes are hard to build with primitive shapes
  • Terrain
  • Animals
  • Plants
  • Machinery
  • Instead, build shapes out of atomic components:
  • Points, lines, and faces
  • Shape building is like a 3-D connect-the-dots game:
  • Place dots at 3-D locations
  • Connect-the-dots to form shapes
  • A coordinate specifies a 3-D dot location
  • Measured relative to a coordinate system origin
  • A geometry node specifies how to connect the dots
  • A Coordinate node contains a list of coordinates for use in building a shape
    Coordinate {
        point [
    #       X   Y   Z
            2.0 1.0 3.0,
            4.0 2.5 5.3,
            . . .
        ]
    }
  • Build shapes using geometry nodes:
  • PointSet
  • IndexedLineSet
  • IndexedFaceSet
  • For all three nodes, use a Coordinate node as the value of the coord field
  • A PointSet geometry node creates geometry out of points
  • One point (a dot) is placed at each coordinate
  • PointSet {
        coord Coordinate {
            point [  . . .  ]
        }
    }

    [ ptplot.wrl ]
  • An IndexedLineSet geometry node creates geometry out of lines
  • A straight line is drawn between pairs of selected coordinates
  • IndexedLineSet {
        coord Coordinate {
            point [  . . .  ]
        }
        coordIndex [ . . . ]
    }
  • Each coordinate in a Coordinate node is implicitly numbered
  • Index 0 is the first coordinate
  • Index 1 is the second coordinate, etc.
  • To build a line shape
  • Make a list of coordinates, using their indexes
  • Use an IndexedLineSet node to draw a line from coordinate to coordinate in the list
  • coordIndex [ 1, 0, 3, -1, . . . ]
  • 1, 0 : draw from coordinate 1 to coordinate 0
  • 0, 3 : draw from coordinate 0 to coordinate 3
  • -1 : end the line sequence
  • List coordinate indexes in the coordIndex field of the IndexedLineSet node

    [ lnplot.wrl ]
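  • For example (a minimal sketch), a square outline drawn by visiting four coordinates and returning to the first:

    Shape {
        geometry IndexedLineSet {
            coord Coordinate {
                point [
                    0.0 0.0 0.0,
                    1.0 0.0 0.0,
                    1.0 1.0 0.0,
                    0.0 1.0 0.0
                ]
            }
            coordIndex [ 0, 1, 2, 3, 0, -1 ]
        }
    }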
  • An IndexedFaceSet geometry node creates geometry out of faces
  • A flat facet (polygon) is drawn using an outline specified by coordinates
  • IndexedFaceSet {
        coord Coordinate {
            point [  . . .  ]
        }
        coordIndex [ . . . ]
    }
  • To build a face shape
  • Make a list of coordinates, using their indexes
  • Use an IndexedFaceSet node to draw a face outlined by the coordinates in the list
  • List coordinate indexes in the coordIndex field of the IndexedFaceSet node

    [ lightng.wrl ]
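  • For example (a minimal sketch), a single square face outlined by four coordinates:

    Shape {
        geometry IndexedFaceSet {
            coord Coordinate {
                point [
                    0.0 0.0 0.0,
                    1.0 0.0 0.0,
                    1.0 1.0 0.0,
                    0.0 1.0 0.0
                ]
            }
            coordIndex [ 0, 1, 2, 3, -1 ]
        }
    }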
  • A CoordinateInterpolator node describes a coordinate path
  • keys - key fractions
  • values - key coordinate lists (X,Y,Z lists)

    CoordinateInterpolator {
        key [ 0.0, . . . ]
        keyValue [ 0.0 1.0 0.0, . . . ]
    }
  • Shapes are built by connecting together coordinates

  • Coordinates are listed in a Coordinate node

  • Coordinates are implicitly numbered starting at 0

  • Coordinate index lists give the order in which to use coordinates
  • The PointSet node draws a dot at every coordinate
  • The coord field value is a Coordinate node
  • The IndexedLineSet node draws lines between coordinates
  • The coord field value is a Coordinate node
  • The coordIndex field value is a list of coordinate indexes
  • The IndexedFaceSet node draws faces outlined by coordinates
  • The coord field value is a Coordinate node
  • The coordIndex field value is a list of coordinate indexes
  • The CoordinateInterpolator node converts times to coordinates
  • Building terrains is very common
  • Hills, valleys, mountains
  • Other tricky uses...
  • You can build a terrain using an IndexedFaceSet node

  • You can build terrains more efficiently using an ElevationGrid node
  • An ElevationGrid geometry node creates terrains
  • X & Z dimensions - grid size
  • X & Z spacings - row and column distances
  • more . . .
  • ElevationGrid {
        xDimension 3
        zDimension 2
        xSpacing   1.0
        zSpacing   1.0
        . . .
    }
  • An ElevationGrid geometry node creates terrains
  • height - elevations at grid points
  • ElevationGrid {
        . . .
        height [
            0.0, -0.5, 0.0,
            0.2,  4.0, 0.0
        ]
    }
    Shape {
        . . .
        geometry ElevationGrid {
            xDimension 9
            zDimension 9
            xSpacing   1.0
            zSpacing   1.0
            height [ . . . ]
        }
    }

    [ mount.wrl ]
  • An ElevationGrid node efficiently creates a terrain

  • Grid size is specified in the xDimension and zDimension fields

  • Grid spacing is specified in the xSpacing and zSpacing fields

  • Elevations at each grid point are specified in the height field
  • Extruded shapes are very common
  • Tubes, pipes, bars, vases, donuts
  • Other tricky uses...
  • You can build extruded shapes using an IndexedFaceSet node

  • You can build extruded shapes more easily and efficiently using an Extrusion node

    [ slide.wrl ]

    [ donut.wrl ]
  • Extruded shapes are described by
  • A 2-D cross-section
  • A 3-D spine along which to sweep the cross-section
  • Extruded shapes are like long bubbles created with a bubble wand
  • The bubble wand's outline is the cross-section
  • The path along which you swing the wand is the spine
  • An Extrusion geometry node creates extruded geometry
  • 2-D cross-section - cross-section
  • 3-D spine - sweep path
  • more . . .
  • Extrusion {
        crossSection [ . . . ]
        spine [ . . .  ]
        . . .
    }
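  • For example (a minimal sketch), a square cross-section swept along a straight vertical spine to make a square bar:

    Shape {
        geometry Extrusion {
            crossSection [
                -0.5 -0.5,  0.5 -0.5,
                 0.5  0.5, -0.5  0.5,
                -0.5 -0.5
            ]
            spine [ 0.0 0.0 0.0, 0.0 2.0 0.0 ]
        }
    }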
  • You can scale the cross-section along the spine
  • Vases, musical instruments
  • Surfaces of revolution
  • You can rotate the cross-section along the spine
  • Twisting ribbons
  • An Extrusion geometry node creates geometry using
  • scales - cross-section scaling per spine point
  • rotations - cross-section rotation per spine point
  • Extrusion {
        . . .
        scale [ . . . ]
        orientation [ . . . ]
    }
  • An Extrusion node efficiently creates extruded shapes

  • The crossSection field specifies the cross-section

  • The spine field specifies the sweep path

  • The scale and orientation fields specify scaling and rotation at each spine point
  • The Material node gives an entire shape the same color

  • You can provide colors for parts of a shape using a Color node
  • Flames, mountains, radiosity lighting

  • [ cmount.wrl ]
  • A Color node contains a list of RGB values
    Color {
        color [ 1.0 0.0 0.0, . . . ]
    }

  • Used as the color field value of IndexedFaceSet, IndexedLineSet, PointSet or ElevationGrid nodes
  • Colors in the Color node override those in the Material node

  • You can bind colors
  • To each point, line, or face
  • To each coordinate in a line or face
  • A PointSet geometry node creates geometry out of points
  • color - provides a list of colors
  • Always binds one color to each point, in order
  • PointSet {
        coord Coordinate { . . . }
        color Color { . . . }
    }

    [ scatter.wrl ]
  • An IndexedLineSet geometry node creates geometry out of lines
  • color - a list of colors
  • color indexes - selects colors from list (just like selecting coordinates)
  • color per vertex - control color binding

    IndexedLineSet {
        coord Coordinate { . . . }
        coordIndex [ . . . ]
        color Color { . . . }
        colorIndex [ . . . ]
        colorPerVertex TRUE
    }
  • The colorPerVertex field controls how color indexes are used
  • FALSE: one color index to each line (ending at -1 coordinate indexes)

  • TRUE: one color index to each coordinate index of each line (including -1 coordinate indexes)

  • [ burst.wrl ]
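  • For example (a minimal sketch), two lines colored red and blue using one color per line:

    IndexedLineSet {
        coord Coordinate {
            point [ 0.0 0.0 0.0, 1.0 0.0 0.0, 1.0 1.0 0.0 ]
        }
        coordIndex [ 0, 1, -1, 1, 2, -1 ]
        color Color {
            color [ 1.0 0.0 0.0, 0.0 0.0 1.0 ]
        }
        colorIndex [ 0, 1 ]
        colorPerVertex FALSE
    }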
  • An IndexedFaceSet geometry node creates geometry out of faces
  • color - a list of colors
  • color indexes - selects colors from list (just like selecting coordinates)
  • color per vertex - control color binding

    IndexedFaceSet {
        coord Coordinate { . . . }
        coordIndex [ . . . ]
        color Color { . . . }
        colorIndex [ . . . ]
        colorPerVertex TRUE
    }
  • The colorPerVertex field controls how color indexes are used (similar to line sets)
  • FALSE: one color index to each face (ending at -1 coordinate indexes)

  • TRUE: one color index to each coordinate index of each face (including -1 coordinate indexes)

  • [ flames.wrl ]
  • An ElevationGrid geometry node creates terrains
  • color - a list of colors
  • color per vertex - control color binding

    ElevationGrid {
        height [ . . . ]
        color Color { . . . }
        colorPerVertex TRUE
    }
  • The ElevationGrid node does not use color indexes
  • The colorPerVertex field controls how colors are used (similar to line and face sets)
  • FALSE: one color to each grid square

  • TRUE: one color to each height for each grid square

  • [ cmount.wrl ]
  • The Color node lists colors to use for parts of a shape
  • Used as the value of the color field
  • Color indexes select colors to use
  • Colors override Material node
  • The colorPerVertex field selects color per line/face/grid square or color per coordinate
  • Use IndexedFaceSet node to build surface
  • Use transparency field of Material node to make surfaces semi-transparent
  • Use ElevationGrid node to build surface
  • Use Color node to provide colors for each elevation grid point
  • Use Extrusion node to build particle paths
  • Use spine field to give 3D coordinates along particle paths
  • Use crossSection field to describe a 2D box for a ribbon
  • Elevation grid task

    Extrusion task

    Terrain task

    Ribbon task
  • You can model every tiny texture detail of a world using a vast number of colored faces
  • Takes a long time to write the VRML
  • Takes a long time to draw
  • Use a trick instead
  • Take a picture of the real thing
  • Paste that picture on the shape, like sticking on a decal
  • This technique is called Texture Mapping

    [ can.wrl ]
  • Image textures
  • A single image from a file
  • JPEG, GIF, or PNG format
  • Pixel textures
  • A single image, given in the VRML file itself
  • Movie textures
  • A movie from a file
  • MPEG format
  • An Appearance node describes overall shape appearance
  • texture - texture source

    Appearance {
        material Material { . . . }
        texture ImageTexture { . . . }
    }
  • Color textures override the color in a Material node

  • Grayscale textures multiply with the Material node color
  • Good for colorizing grayscale textures
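  • For example (a sketch, assuming a grayscale image file named gray.jpg), the material's diffuse color tints the grayscale texture blue:

    Appearance {
        material Material { diffuseColor 0.2 0.6 1.0 }
        texture ImageTexture { url "gray.jpg" }
    }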
  • An ImageTexture node selects a texture image for texture mapping
  • url - texture image file URL

    ImageTexture {
        url "wood.jpg"
    }
  • A PixelTexture node specifies texture image pixels for texture mapping
  • image pixels - texture image pixels
  • image data - width, height, bytes/pixel, pixel values

    PixelTexture {
        image 2 1 3 0xFFFF00 0xFF0000
    }
  • A MovieTexture node selects a texture movie for texture mapping
  • url - texture movie file URL
  • When to play the movie, and how quickly (like a TimeSensor node)

    MovieTexture {
        url "movie.mpg"
        loop TRUE
        speed 1.0
    }
  • Texture images can include color and transparency values for each pixel

  • Pixel transparencies enable you to make parts of a shape transparent
  • Windows, grillwork, holes
  • Trees, clouds
  • A texture is like a decal pasted to a shape

  • Specify the texture using an ImageTexture, PixelTexture, or MovieTexture node in an Appearance node

  • Color textures override material, grayscale textures multiply

  • Textures with transparency create holes
  • By default, you have one light in the scene, attached to your head

  • For more realism, you can add multiple lights
  • Suns, light bulbs, candles
  • Flashlights, spotlights, firelight
  • Lights can be positioned, oriented, and colored

  • Lights do not cast shadows
  • There are three types of VRML lights
  • Point lights - radiate in all directions from a point

  • Directional lights - aim in one direction from infinitely far away

  • Spot lights - aim in one direction from a point, radiating in a cone
  • All lights have several common fields:
  • on - turn it on or off
  • intensity - control brightness
  • ambientIntensity - control ambient effect
  • color - select color
  • Point lights and spot lights also have:
  • location - position
  • radius - maximum lighting distance
  • attenuation - drop off with distance
  • Directional lights and spot lights also have
  • direction - aim direction
  • A PointLight node illuminates radially from a point

    PointLight {
        location 0.0 0.0 0.0
        intensity 1.0
        color 1.0 1.0 1.0
    }

  • A DirectionalLight node illuminates in one direction from infinitely far away

    DirectionalLight {
        direction 1.0 0.0 0.0
        intensity 1.0
        color 1.0 1.0 1.0
    }

  • A SpotLight node illuminates from a point, in one direction, within a cone

    SpotLight {
        location  0.0 0.0 0.0
        direction 1.0 0.0 0.0
        intensity 1.0
        color 1.0 1.0 1.0
    }

  • The maximum width of a spot light's cone is controlled by the cutOffAngle field

  • An inner cone region with constant brightness is controlled by the beamWidth field

    SpotLight {
        . . .
        cutOffAngle 0.785
        beamWidth   1.571
    }

    [ temple.wrl ]
  • There are three types of lights: point, directional, and spot

  • All lights have an on/off, intensity, ambient effect, and color

  • Point and spot lights have a location, radius, and attenuation

  • Directional and spot lights have a direction
  • Shapes form the foreground of your scene

  • You can add a background to provide context

  • Backgrounds describe:
  • Sky and ground colors
  • Panorama images of mountains, cities, etc.
  • Backgrounds are faster to draw than if you used shapes to build them
  • A background creates three special shapes:
  • A sky sphere
  • A ground sphere inside the sky sphere
  • A panorama box inside the ground sphere
  • The sky and ground spheres are shaded with a color gradient

  • The panorama box is texture mapped with six images
  • Transparent parts of the ground sphere reveal the sky sphere

  • Transparent parts of the panorama box reveal the ground and sky spheres

  • The viewer can look up, down, and side-to-side to see different parts of the background

  • The viewer can never get closer to the background
  • A Background node describes background colors
  • ground colors and angles - ground gradation
  • sky colors and angles - sky gradation
  • more . . .
  • Background {
        groundColor [ 0.0 0.2 0.7, . . . ]
        groundAngle [ 1.309, 1.571 ]
        skyColor    [ 0.1 0.1 0.0, . . . ]
        skyAngle    [ 1.309, 1.571 ]
    }

    [ back.wrl ]
  • A Background node describes background colors
  • frontUrl - texture image URL for box front
  • etc . . .
  • Background {
        . . .
        frontUrl  "mountns.png"
        backUrl   "mountns.png"
        leftUrl   "mountns.png"
        rightUrl  "mountns.png"
        topUrl    "clouds.png"
        bottomUrl "ground.png"
    }

    [ back2.wrl ]
  • Backgrounds describe:
  • Ground and sky color gradients on ground and sky spheres

  • Panorama images on a panorama box
  • The viewer can look around, but never get closer to the background
  • Fog increases realism:
  • Add fog outside to create hazy worlds
  • Add fog inside to create dark dungeons
  • Use fog to set a mood
  • The further the viewer can see, the more you have to model and draw

  • To reduce development time and drawing time, limit the viewer's sight by using fog
  • The fog type selects linear or exponential visibility reduction with distance
  • Linear is easier to control
  • Exponential is more realistic and "thicker"
  • The visibility range selects the distance where the fog reaches maximum thickness
  • Fog is "clear" at the viewer, and gradually reduces visibility
  • Fog has a fog color
  • White is typical, but black, red, etc. also possible
  • Shapes are faded to the fog color with distance

  • The background is unaffected
  • For the best effect, make the background the fog color
  • A Fog node creates colored fog
  • color - fog color
  • type - fog type
  • visibility range - maximum visibility limit
  • Fog {
        color 1.0 1.0 1.0
        fogType "LINEAR"
        visibilityRange 0.0
    }

    [ fog2.wrl ]
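  • For example (a minimal sketch), white fog with a matching white background:

    Fog {
        color 1.0 1.0 1.0
        fogType "LINEAR"
        visibilityRange 30.0
    }
    Background {
        skyColor [ 1.0 1.0 1.0 ]
    }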
  • Fog has a color, a type, and a visibility range

  • Fog can be used to set a mood, even indoors

  • Fog limits the viewer's sight:
  • Reduces the amount of the world you have to build
  • Reduces the amount of the world that must be drawn
  • Sounds can be triggered by viewer actions
  • Clicks, horn honks, door latch noises
  • Sounds can be continuous in the background
  • Wind, crowd noises, elevator music
  • Sounds emit from a location, in a direction, within an area
  • Sounds have two components
  • A sound source providing a sound signal
  • Like a stereo component
  • A sound emitter converts a signal to virtual sound
  • Like a stereo speaker
  • An AudioClip node creates a digital sound source
  • url - a sound file URL
  • pitch - playback speed
  • playback controls, like a TimeSensor node

    AudioClip {
        url "myfile.wav"
        pitch 1.0
        startTime 0.0
        stopTime  0.0
        loop FALSE
    }
  • A MovieTexture node creates a movie sound source
  • url - a texture movie file URL
  • speed - playback speed
  • playback controls, like a TimeSensor node

    MovieTexture {
        startTime 0.0
        stopTime  0.0
        loop FALSE
        speed 1.0
        url "movie.mpg"
    }
  • Supported by the AudioClip node:
  • WAV - digital sound files
  • Good for sound effects
  • MIDI - General MIDI musical performance files
  • MIDI files are good for background music
  • Supported by the MovieTexture node:
  • MPEG - movie file with sound
  • Good for virtual TVs
  • A Sound node describes a sound emitter
  • source - AudioClip or MovieTexture node
  • location and direction - emitter placement
  • more . . .

    Sound {
        source AudioClip { . . . }
        location  0.0 0.0 0.0
        direction 0.0 0.0 1.0
    }
  • A Sound node describes a sound emitter
  • intensity - volume
  • spatialize - use spatialize processing
  • priority - prioritize the sound
  • more . . .

    Sound {
        . . .
        intensity 1.0
        spatialize TRUE
        priority 0.0
    }
  • A Sound node describes a sound emitter
  • minimum and maximum range - area in which sound can be heard

    Sound {
        . . .
        minFront 1.0
        minBack  1.0
        maxFront 10.0
        maxBack  10.0
    }
  • The sound range fields specify two ellipsoids
  • minFront and minBack control an inner ellipsoid
  • maxFront and maxBack control an outer ellipsoid
  • Sound has a constant volume inside the inner ellipsoid

  • Sound drops to zero volume from the inner to the outer ellipsoid
  • AudioClip node:
  • loop FALSE
  • Set startTime from a sensor node
  • Sound node:
  • spatialize TRUE
  • minFront etc. with small values
  • priority 1.0
  • Sound {
        source DEF C4 AudioClip {
            url "tone1.wav"
            pitch 1.0
        }
    }
    ROUTE Touch.touchTime
       TO C4.set_startTime

    [ kbd.wrl ]
  • AudioClip node:
  • loop TRUE
  • startTime 0.0
  • stopTime 0.0
  • Sound node:
  • spatialize TRUE
  • minFront etc. with medium values
  • priority 0.0
  • AudioClip node:
  • loop TRUE
  • startTime 0.0
  • stopTime 0.0
  • Sound node:
  • spatialize FALSE
  • minFront etc. with large values
  • priority 0.0
  • Sound {
        source AudioClip {
            url "willow1.wav"
            loop TRUE
        }
    }
    

    [ ambient.wrl ]
  • An AudioClip node or a MovieTexture node describe a sound source
  • A URL gives the sound file
  • Looping, start time, and stop time control playback
  • A Sound node describes a sound emitter
  • A source node provides the sound
  • Range fields describe the sound volume
  • The further the viewer can see, the more there is to draw

  • If a shape is distant:
  • The shape is smaller
  • The viewer can't see as much detail
  • So... draw it with less detail
  • Varying detail with distance reduces upfront download time, and increases drawing speed
  • To control detail, model the same shape several times
  • high detail for when the viewer is close up
  • medium detail for when the viewer is nearish
  • low detail for when the viewer is distant
  • Usually, two or three different versions is enough, but you can have as many as you want
  • Group the shape versions as levels in an LOD grouping node
  • LOD is short for Level of Detail
  • List them from highest to lowest detail
  • Give the entire group a center point
  • Use a list of ranges for version switch points
  • If you have 3 versions, you need 2 ranges
  • Ranges are hints to the browser
  • range [ 7.5, 12.0 ]
    viewer < 7.5             1st child used
    7.5 <= viewer < 12.0     2nd child used
    12.0 <= viewer           3rd child used
  • An LOD grouping node creates a group of shapes describing different versions of the same shape
  • center - the center of the shape
  • range - a list of version switch ranges
  • level - a list of shape versions

    LOD {
        center 0.0 0.0 0.0
        range [ . . . ]
        level [ . . . ]
    }
  • Suggested procedure to make different versions:
  • Make the high detail shape first
  • Copy it to make a medium detail version
  • Move the medium detail shape to a desired switch distance
  • Delete parts that aren't dominant
  • Repeat for a low detail version
  • Lower detail versions should use simpler geometry, fewer textures, and no text
    LOD {
        center 0.0 0.0 0.0
        range [ 7.5, 12.0 ]
        level [
            Inline { url "torch1.wrl" }
            Inline { url "torch2.wrl" }
            Inline { url "torch3.wrl" }
        ]
    }

    [ torches.wrl ]
  • Increase performance by making multiple versions of shapes
  • High detail for close up viewing
  • Lower detail for more distant viewing
  • Group the versions in an LOD node
  • Ordered from high detail to low detail
  • Ranges to select switching distances
  • By default, the viewer enters a world at (0.0, 0.0, 10.0)

  • You can provide your own preferred view points
  • Select the entry point position
  • Select favorite views for the viewer
  • Name the views for a browser menu
  • Viewpoints specify a desired location, an orientation, and a camera field of view lens angle

  • Viewpoints can be transformed using a Transform node

  • The first viewpoint found in a file is the entry point
  • A Viewpoint node specifies a named viewing location
  • position and orientation - viewing location
  • fieldOfView - camera lens angle
  • description - description for viewpoint menu

    Viewpoint {
        position    0.0 0.0 10.0
        orientation 0.0 0.0 1.0 0.0
        fieldOfView 0.785
        description "Entry View"
    }
  • Specify favorite viewpoints in Viewpoint nodes

  • The first viewpoint in the file is the entry viewpoint
  • Different types of worlds require different styles of navigation
  • Walk through a dungeon
  • Fly through a cloud world
  • Examine shapes in a CAD application
  • You can select the navigation type

  • You can describe the size and speed of the viewer's avatar
  • There are four standard navigation types:
  • WALK - walk, pulled down by gravity
  • FLY - fly, unaffected by gravity
  • EXAMINE - examine an object at "arms length"
  • NONE - no navigation, movement controlled by world not viewer!
  • Some browsers support additional navigation types
  • An avatar is a representation of the user in the virtual world and is described by:
  • A visual appearance to other users
  • A height, width, and step height
  • A movement speed
  • Avatar appearance is described by proposed VRML extensions

  • Avatar overall size and speed is controlled by the NavigationInfo node
  • By default, a headlight is placed on the avatar's head and aimed in the head direction

  • You can turn this headlight on and off
  • Most browsers provide a menu option to control the headlight
  • You can also control the headlight with the NavigationInfo node
  • A NavigationInfo node selects the navigation type and avatar characteristics
  • type - navigation style
  • avatarSize and speed - avatar characteristics
  • headlight - headlight on or off
  • NavigationInfo {
        type       "WALK"
        avatarSize [ 0.25, 1.6, 0.75 ]
        speed      1.0
        headlight  TRUE
    }
  • The navigation type specifies how a viewer can move in a world
  • walk, fly, examine, or none
  • The avatar overall size and speed specify the viewer's avatar characteristics
  • Sensing the viewer enables you to trigger animations
  • when a region is visible to the viewer
  • when the viewer is within a region
  • when the viewer collides with a shape
  • The LOD and Billboard nodes are special-purpose viewer sensors with built-in responses
  • There are three types of viewer sensors:
  • A VisibilitySensor node senses if the viewer can see a region

  • A ProximitySensor node senses if the viewer is within a region

  • A Collision node senses if the viewer has collided with shapes
  • VisibilitySensor and ProximitySensor nodes sense a box-shaped region
  • center - region center
  • size - region dimensions
  • Both nodes have similar outputs:
  • enterTime - sends time on visible or region entry
  • exitTime - sends time on not visible or region exit
  • isActive - sends true on entry, false on exit
  • A VisibilitySensor node senses if the viewer can see a region
  • center and size - the region's location and size
  • enterTime and exitTime - sends time on entry/exit
  • isActive - sends true/false on entry/exit
  • DEF DoorSense VisibilitySensor {
        center 0.0 1.75 0.0
        size 3.0 2.5 1.0
    }
    ROUTE DoorSense.enterTime
       TO OpenSound.set_startTime

    [ vis1.wrl ]
  • A ProximitySensor node senses if the viewer is in a region
  • center and size - the region's location and size
  • enterTime and exitTime - sends time on entry/exit
  • isActive - sends true/false on entry/exit
  • more . . .
  • DEF DoorSense ProximitySensor {
        center 0.0 1.75 0.0
        size   6.0 3.5 8.0
    }
    ROUTE DoorSense.enterTime
       TO OpenSound.set_startTime
  • A ProximitySensor node senses if the viewer is in a region
  • position and orientation - sends position and orientation while viewer is in the region
  • DEF DoorSense ProximitySensor {
        . . .
    }
    ROUTE DoorSense.position_changed
       TO PetRobotFollower.set_translation

    [ prox1.wrl ]
  • A Collision grouping node senses shapes within the group
  • Detects if the viewer collides with any shape in the group
  • Automatically stops the viewer from going through the shape
  • Collision occurs when the viewer's avatar gets close to a shape
  • Collision distance is controlled by the avatar size in the NavigationInfo node
  • Collision checking is expensive, so check for collision with a proxy shape instead
  • Proxy shapes are typically extremely simplified versions of the actual shapes
  • Proxy shapes are never drawn
  • A collision group with a proxy shape, but no children, creates an invisible collidable shape
  • Windows and invisible railings
  • Invisible world limits
  • A Collision grouping node senses if the viewer collides with group shapes
  • collide - enable/disable sensor
  • children - children to sense
  • proxy - simple shape to sense instead of children
  • DEF DoorCollide Collision {
        proxy . . .
        children [ . . . ]
    }
    ROUTE DoorCollide.collideTime
       TO OpenSound.set_startTime

    [ collide1.wrl ]
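  • For example (a minimal sketch), an invisible wall built from a proxy shape with no children:

    Collision {
        proxy Shape {
            geometry Box { size 10.0 3.0 0.1 }
        }
    }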
  • Collision is on by default
  • Turn it off whenever possible!
  • However, once a parent turns off collision, a child can't turn it back on!

  • Collision results from viewer colliding with a shape, but not from a shape colliding with a viewer
  • Any number of sensors can sense at the same time
  • You can have multiple visibility, proximity, and collision sensors

  • Sensor areas can overlap

  • If multiple sensors should trigger, they do
  • A VisibilitySensor node checks if a region is visible to the viewer
  • The region is described by a center and a size

  • Time is sent on entry and exit of visibility

  • True/false is sent on entry and exit of visibility
  • A ProximitySensor node checks if the viewer is within a region
  • The region is described by a center and a size

  • Time is sent on viewer entry and exit

  • True/false is sent on viewer entry and exit

  • Position and orientation of the viewer is sent while within the sensed region
  • A Collision grouping node checks if the viewer has run into a shape
  • The shapes are defined by the group's children or a proxy

  • Collision time is sent on contact
  • Many actions are too complex for built-in sensors, interpolators, shapes, etc.
  • Computed animation paths (e.g., gravity)
  • Algorithmic shapes (e.g., fractals)
  • Collaborative environments (e.g., games)
  • You can create new sensors, interpolators, etc., using program scripts written in Java or JavaScript.
  • A Script node describes a script and its interface
  • url - the program script to use
  • more . . .

    Script {
        url "sine1.class"
    or...
        url "sine1.js"
    or...
        url "javascript: ..."
    }
  • Script nodes also declare:
  • fields and events - the interface
  • Each has a name and data type
  • Fields have an initial value
  • Script {
        field SFFloat cycles 2.0
        field SFFloat radius 0.5
        . . .
        eventIn  SFFloat set_fraction
        eventOut SFVec3f position_changed
    }
    DEF Spiral Script { . . .  }
    
    ROUTE Clock.fraction_changed
       TO Spiral.set_fraction
    
    ROUTE Spiral.position_changed
       TO Ball.set_translation

    [ sine1.wrl ]
  • The Script node selects a program script, specified by a URL

  • Program scripts have field and event interface declarations, each with
  • A data type
  • A name
  • An initial value (fields only)
  • Program scripts can be written in Java, JavaScript, and other languages
  • JavaScript is easier to program
  • Java is more powerful
  • SGI browsers support VRMLScript, a derivative of JavaScript

  • Most other browsers support Java

  • For any language, the program script responds to inputs and sends outputs
  • Recall that the Script node declares:
  • fields and events - the interface
  • Script {
        field SFFloat cycles 2.0
        field SFFloat radius 0.5
        . . .
        eventIn  SFFloat set_fraction
        eventOut SFVec3f position_changed
    }
  • The program script implements the node using values from the interface

  • Define one function for each event input
  • Arguments include event value and time
  • function set_fraction( f, tm ) {
        . . .
    }
  • The event function is called when the event is received

  • If multiple events arrive at once, then multiple event functions are called

  • The event function can compute values and send events
  • After some, or all, event functions have been called, the optional eventsProcessed function is called

    function eventsProcessed ( ) {
        . . .
    }
  • The program script can read and write interface field values:

    angle = f * 6.28 * cycles + phase;
  • The program script can write to an interface eventOut to send an event:

    position_changed[0] = radius * Math.sin( angle );
    position_changed[1] = f*deltaHeight + startHeight;
    position_changed[2] = radius * Math.sin( angle+1.571 );
  • When the script is first loaded, the initialize function is called
    function initialize ( ) {
        . . .
    }

  • Just before the script is unloaded, the shutdown function is called
    function shutdown ( ) { . . . }
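  • Putting the functions together, a small hypothetical Script node showing where each one fits:

    DEF Pulse Script {
        field    SFFloat scale 2.0
        eventIn  SFFloat set_fraction
        eventOut SFFloat value_changed
        url "javascript:
            function initialize( ) {
                // one-time setup when the script is loaded
            }
            function set_fraction( f, tm ) {
                // respond to an incoming event and send an output
                value_changed = f * scale;
            }
            function eventsProcessed( ) {
                // optional: called after a batch of events
            }
            function shutdown( ) {
                // called just before the script is unloaded
            }"
    }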
  • Create a Spiral interpolator that computes a spiral path from a fractional time input

  • Fields needed:
  • Path radius
  • Number of cycles
  • Phase
  • Starting and ending height
  • Fraction offset for use with multiple paths
  • DEF Spiral Script {
        field SFFloat cycles 2.0
        field SFFloat radius 0.5
        field SFFloat phase  0.0
        field SFFloat startHeight 1.7
        field SFFloat endHeight   0.3
        field SFFloat fractionOffset 0.0
        . . .
    }
  • Inputs and outputs needed:
  • Fraction input
  • Position output
  • DEF Spiral Script {
        . . .
        eventIn  SFFloat set_fraction
        eventOut SFVec3f position_changed
        . . .
    }
  • Functions needed:
  • set_fraction
  • Script {
        . . .
        url "javascript:
        function set_fraction( f, tm ) {
            . . .
        }"
    }
  • Code needed:
  • Offset the fractional time
  • Compute an angle
  • Compute the total height
  • Compute X, Y, and Z positions
  • function set_fraction( f, tm ) {
        f += fractionOffset;
        while ( f > 1.0 )
            f -= 1.0;
        angle = f * 6.28 * cycles + phase;
        deltaHeight = endHeight - startHeight;
    
        position_changed[0] = radius * Math.sin( angle );
        position_changed[1] = f*deltaHeight + startHeight;
        position_changed[2] = radius * Math.sin( angle+1.571 );
    }
  • Routes needed:
  • Clock into script's set_fraction
  • Script's position_changed into transform
  • ROUTE Clock.fraction_changed
       TO Spiral.set_fraction
    ROUTE Spiral.position_changed
       TO Ball.set_translation

    [ sine1.wrl ]
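  • For reference, a simplified sketch of how these pieces might be assembled in one file (the actual sine1.wrl may differ in detail, e.g. it also uses phase and fractionOffset):

    #VRML V2.0 utf8
    # Clock drives the Spiral script, which moves the Ball transform
    DEF Ball Transform {
        children [
            Shape {
                appearance Appearance { material Material { } }
                geometry Sphere { radius 0.1 }
            }
        ]
    }
    DEF Clock TimeSensor {
        cycleInterval 4.0
        loop TRUE
    }
    DEF Spiral Script {
        field    SFFloat cycles 2.0
        field    SFFloat radius 0.5
        field    SFFloat startHeight 1.7
        field    SFFloat endHeight   0.3
        eventIn  SFFloat set_fraction
        eventOut SFVec3f position_changed
        url "javascript:
            function set_fraction( f, tm ) {
                angle = f * 6.28 * cycles;
                deltaHeight = endHeight - startHeight;
                position_changed[0] = radius * Math.sin( angle );
                position_changed[1] = f * deltaHeight + startHeight;
                position_changed[2] = radius * Math.sin( angle + 1.571 );
            }"
    }
    ROUTE Clock.fraction_changed TO Spiral.set_fraction
    ROUTE Spiral.position_changed TO Ball.set_translation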
  • JavaScript functions are called when an event is received

  • The eventsProcessed function is called after all events have been received

  • The initialize and shutdown functions are called at load and unload

  • Functions can get field values, and send event outputs
  • You can create new node types that encapsulate:
  • Shapes
  • Sensors
  • Interpolators
  • Scripts
  • anything else . . .
  • This creates high-level nodes
  • Robots, menus, new shapes, etc.
  • A PROTO statement declares a new node type
  • name - the new node type name
  • fields and events - interface to the prototype
  • PROTO SpiralBall [
        field SFFloat cycles 1.0
        field SFFloat radius 1.0
        . . .
        eventIn SFFloat set_fraction
    ] { . . . }
  • PROTO defines:
  • body - nodes and routes for the new node type
  • PROTO SpiralBall [ . . .  ] {
        Group {
            children [ . . . ]
        }
        ROUTE . . .
    }
  • The IS syntax connects a prototype interface field, eventIn, or eventOut to the body

    PROTO SpiralBall [
        field SFColor ballColor 1.0 1.0 1.0
        . . .
    ] {
        . . . diffuseColor IS ballColor
    }
  • Interface fields may be connected to fields or exposed fields

  • Interface eventIns may be connected to eventIns or exposed fields

  • Interface eventOuts may be connected to eventOuts or exposed fields

  • Interface exposed fields may be connected to exposed fields
  • The new node type can be used like any other type

    DEF Ball1 SpiralBall {
        cycles 2.0
        radius 0.5
    }
    
    ROUTE Clock.fraction_changed
       TO Ball1.set_fraction
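  • As a complete, hypothetical illustration of PROTO, IS, and instantiation together:

    #VRML V2.0 utf8
    # A prototype wrapping a colored box
    PROTO ColoredBox [
        field SFColor boxColor 1.0 1.0 1.0
        field SFVec3f boxSize  1.0 1.0 1.0
    ] {
        Shape {
            appearance Appearance {
                material Material { diffuseColor IS boxColor }
            }
            geometry Box { size IS boxSize }
        }
    }
    # Use the new node type like any other node
    ColoredBox {
        boxColor 1.0 0.0 0.0
        boxSize  2.0 0.5 0.5
    }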
  • Recall that node use must be appropriate for the context
  • A Shape node specifies shape, not color
  • A Material node specifies color, not shape
  • A Box node specifies geometry, not shape or color
  • The context for a new node type depends upon the first node in the PROTO body

  • For example, if the first node is a geometry node:
  • The prototype creates a new geometry node type
  • The new node type can be used wherever the first node of the prototype body can be used
  • Create a SpiralBall node type that:
  • Draws a ball at a given color and size
  • Moves that ball along a spiral path
  • Use the spiral program script
  • Fields needed:
  • Ball color and radius
  • All the spiral script fields
  • PROTO SpiralBall [
        field SFColor ballColor  1.0 1.0 1.0
        field SFFloat ballRadius 1.0
        field SFFloat cycles 1.0
        field SFFloat radius 1.0
        field SFFloat phase  0.0
        field SFFloat startHeight 1.0
        field SFFloat endHeight   0.0
        field SFFloat fractionOffset 0.0
        . . .
    ] { . . . }
  • Inputs needed:
  • Fraction input
  • PROTO SpiralBall [
        . . .
        eventIn  SFFloat set_fraction
    ] { . . . }
  • Body needed
  • A group containing all nodes
  • A ball shape inside a transform
  • A spiral path script
  • Route from spiral to ball
  • PROTO SpiralBall [ . . . ] {
        Group {
            children [
                DEF Ball Transform { . . . }
                DEF Spiral Script { . . . }
            ]
        }
        ROUTE . . .
    }

    [ sine2.wrl ]
  • Prototypes are typically in a separate external file

  • An EXTERNPROTO declares a new node type in an external file
  • name, fields, events - as in a PROTO, but without default values
  • url - the URL of the prototype file
  • EXTERNPROTO SpiralBall [
        field SFFloat cycles
        . . .
    ] "spiral.wrl"
  • PROTO declares a new node type and defines its node body

  • EXTERNPROTO declares a new node type, specified by URL

  • The new node can be used anywhere the first node in the prototype body can be used
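  • A minimal sketch of using an externally prototyped node (assuming spiral.wrl defines the SpiralBall prototype):

    #VRML V2.0 utf8
    # Declare the node type; its body and defaults live in spiral.wrl
    EXTERNPROTO SpiralBall [
        field   SFFloat cycles
        field   SFFloat radius
        eventIn SFFloat set_fraction
    ] "spiral.wrl"

    DEF Ball1 SpiralBall {
        cycles 2.0
        radius 0.5
    }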
  • By default, an entire texture image is mapped once around the shape

  • You can also:
  • Extract only pieces of interest
  • Create repeating patterns
  • Imagine the texture image is a big piece of rubbery cookie dough

  • Select a texture image piece
  • Define the shape of a cookie cutter
  • Position and orient the cookie cutter
  • Stamp out a piece of texture dough
  • Stretch the rubbery texture cookie to fit a face
  • Texture images (the dough) are in a texture coordinate system

  • S direction is horizontal
  • T direction is vertical
  • (0,0) at lower-left
  • (1,1) at upper-right
  • Texture coordinates and texture coordinate indexes specify a texture piece shape (the cookie cutter)

    0.0 0.0,
    1.0 0.0,
    1.0 1.0,
    0.0 1.0

  • Texture transforms translate, rotate, and scale the texture coordinates (placing the cookie cutter)

  • Bind the texture to a face (stretch the cookie and stick it)

  • Select piece with texture coordinates and indexes
  • Create a cookie cutter
  • Transform the texture coordinates
  • Position and orient the cookie cutter
  • Bind the texture to a face
  • Stamp out the texture and stick it on a face
  • The process is very similar to creating faces!
  • A TextureCoordinate node contains a list of texture coordinates
    TextureCoordinate {
        point [ 0.2 0.2, 0.8 0.2, . . . ]
    }

  • Used as the texCoord field value of IndexedFaceSet or ElevationGrid nodes
  • An IndexedFaceSet geometry node creates geometry out of faces
  • Texture coordinates and indexes - specify texture pieces

    IndexedFaceSet {
        coord Coordinate { . . . }
        coordIndex [ . . . ]
        texCoord TextureCoordinate { . . . }
        texCoordIndex [ . . . ]
    }
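  • For example, a minimal sketch mapping the center piece of an image onto one square face (brick.jpg is a hypothetical image file):

    #VRML V2.0 utf8
    Shape {
        appearance Appearance {
            material Material { }
            texture ImageTexture { url "brick.jpg" }
        }
        geometry IndexedFaceSet {
            coord Coordinate {
                point [ -1.0 -1.0 0.0,  1.0 -1.0 0.0,
                         1.0  1.0 0.0, -1.0  1.0 0.0 ]
            }
            coordIndex [ 0, 1, 2, 3, -1 ]
            # Cut out the middle of the image (0.2 to 0.8 in S and T)
            texCoord TextureCoordinate {
                point [ 0.2 0.2,  0.8 0.2,  0.8 0.8,  0.2 0.8 ]
            }
            texCoordIndex [ 0, 1, 2, 3, -1 ]
        }
    }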
  • An ElevationGrid geometry node creates terrains
  • Texture coordinates - specify texture pieces
  • Automatically generated texture coordinate indexes

    ElevationGrid {
        height [ . . . ]
        texCoord TextureCoordinate { . . . }
    }
  • An Appearance node describes overall shape appearance
  • textureTransform - the texture coordinate transform

    Appearance {
        material Material { . . . }
        textureTransform TextureTransform { . . . }
    }
  • A TextureTransform node transforms texture coordinates
  • translation - position
  • rotation - orientation
  • scale - size

    TextureTransform {
        translation . . .
        rotation    . . .
        scale       . . .
    }

  • [ pizza.wrl ]

    [ brickb.wrl ]

    [ fence.wrl ]
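  • For example, a repeating pattern might be made by scaling the texture coordinates (brick.jpg is a hypothetical image file):

    Appearance {
        material Material { }
        texture ImageTexture {
            url "brick.jpg"
            repeatS TRUE
            repeatT TRUE
        }
        textureTransform TextureTransform {
            # Multiply texture coordinates by 4 in S and 2 in T
            scale 4.0 2.0
        }
    }

  • With repeating on, texture coordinates scaled past 1.0 wrap around, so the image tiles four times across and twice up each face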
  • Texture images are in a texture coordinate system

  • Texture coordinates and indexes describe a texture cookie cutter

  • Texture transforms translate, rotate, and scale the texture coordinates, placing the cookie cutter

  • Texture indexes bind the cut-out cookie texture to a face on a shape
  • By default, shapes are shaded with faceted shading

  • You can request smooth shading

  • In special cases, you can give detailed shading control using normals

  • For shading tricks, you can animate normals
  • A normal defines the facing direction of a face

  • By default, automatically generated normals create faceted shading

  • You can do smooth shading using the creaseAngle field for
  • IndexedFaceSet
  • ElevationGrid
  • Extrusion
  • A crease angle is a threshold angle between two faces

  • If the angle between two faces >= crease angle, use faceted shading

  • If the angle between two faces < crease angle, use smooth shading

  • Compare: crease angle = 0 vs. crease angle = 45 degrees
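  • A minimal sketch of smooth-shading a small terrain bump with a 45-degree crease angle (note that creaseAngle is given in radians):

    #VRML V2.0 utf8
    Shape {
        appearance Appearance { material Material { } }
        geometry ElevationGrid {
            xDimension 3
            zDimension 3
            xSpacing 1.0
            zSpacing 1.0
            height [ 0.0 0.5 0.0,
                     0.5 1.0 0.5,
                     0.0 0.5 0.0 ]
            # 0.785 radians is about 45 degrees
            creaseAngle 0.785
        }
    }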
  • A Normal node contains a list of normal vectors that override use of a crease angle

    Normal {
        vector [ 0.0 1.0 0.0, . . . ]
    }

  • Usually automatically generated normals are good enough

  • Normals can be given for IndexedFaceSet and ElevationGrid nodes
  • An IndexedFaceSet geometry node creates geometry out of faces
  • Normal vectors - list of normals
  • Normal indexes - select normals from the list (just like selecting coordinates)
  • Normal binding - controls how normals are applied
  • IndexedFaceSet {
        coord Coordinate { . . . }
        coordIndex [ . . . ]
        normal Normal { . . . }
        normalIndex [ . . . ]
        normalPerVertex TRUE
    }
  • The normalPerVertex field controls how normal indexes are used
  • FALSE: one normal index for each face (faces are separated by -1 coordinate indexes)

  • TRUE: one normal index for each coordinate index of each face (including the -1 coordinate indexes)
  • An ElevationGrid geometry node creates terrains
  • Normal vectors - list of normals, applied in grid order
  • Normal binding - controls how normals are applied
  • ElevationGrid {
        height [ . . . ]
        normal Normal { . . . }
        normalPerVertex TRUE
    }
  • The normalPerVertex field controls how normal indexes are used (similar to face sets)
  • FALSE: one normal for each grid square

  • TRUE: one normal for each grid height point
  • A NormalInterpolator node describes a normal path
  • keys - key fractions
  • values - key normal lists (X,Y,Z lists)

    NormalInterpolator {
        key [ 0.0, . . . ]
        keyValue [ 0.0 1.0 1.0, . . . ]
    }
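  • To see the idea, a small hypothetical sketch that rocks the shading normal of a single face back and forth:

    #VRML V2.0 utf8
    Shape {
        appearance Appearance { material Material { } }
        geometry IndexedFaceSet {
            coord Coordinate {
                point [ -1.0 0.0  1.0,  1.0 0.0  1.0,
                         1.0 0.0 -1.0, -1.0 0.0 -1.0 ]
            }
            coordIndex [ 0, 1, 2, 3, -1 ]
            # One normal for the one face
            normal DEF FaceNormal Normal { vector [ 0.0 1.0 0.0 ] }
            normalPerVertex FALSE
        }
    }
    DEF Clock TimeSensor { cycleInterval 4.0  loop TRUE }
    DEF Rocker NormalInterpolator {
        key [ 0.0, 0.5, 1.0 ]
        keyValue [ -0.707 0.707 0.0,
                    0.707 0.707 0.0,
                   -0.707 0.707 0.0 ]
    }
    ROUTE Clock.fraction_changed TO Rocker.set_fraction
    ROUTE Rocker.value_changed   TO FaceNormal.set_vector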
  • The creaseAngle field controls faceted or smooth shading for automatically generated normals

  • The Normal node lists normal vectors to use for parts of a shape
  • Used as the value of the normal field
  • Normal indexes select normals to use
  • The normalPerVertex field selects normal per face/grid square or normal per coordinate

  • The NormalInterpolator node converts times to normals
  • By default, shapes are shaded with a matte finish

  • Using additional fields of the Material node, you can make them shiny

  • Shiny specular highlights are pseudo-reflections of lights
  • Smooth-shaded shapes produce better highlights
  • A Material node controls shape material attributes
  • diffuse color - main shading color
  • specular color - highlight color
  • shininess - highlight size
  • ambient intensity - ambient lighting effects
  • Material {
        diffuseColor 0.22 0.15 0.00
        specularColor 0.71 0.70 0.56
        shininess 0.16
        ambientIntensity 0.4
    }
    Description       ambientIntensity   diffuseColor      specularColor     shininess
    Aluminum          0.3                0.30 0.30 0.50    0.70 0.70 0.80    0.10
    Copper            0.26               0.30 0.11 0.00    0.75 0.33 0.00    0.08
    Gold              0.4                0.22 0.15 0.00    0.71 0.70 0.56    0.16
    Metallic Purple   0.17               0.10 0.03 0.22    0.64 0.00 0.98    0.20
    Metallic Red      0.15               0.27 0.00 0.00    0.61 0.13 0.18    0.20
    Plastic Blue      0.10               0.20 0.20 0.71    0.83 0.83 0.83    0.12
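  • For example, one way to use a row of the table is a shiny copper sphere:

    Shape {
        appearance Appearance {
            material Material {
                diffuseColor     0.30 0.11 0.00
                specularColor    0.75 0.33 0.00
                shininess        0.08
                ambientIntensity 0.26
            }
        }
        geometry Sphere { radius 1.0 }
    }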
  • The diffuseColor field controls the matte color

  • The specularColor and shininess fields control the color and size of the highlight

  • The ambientIntensity field controls the effect of ambient light
  • After you've created a great world, sign it!

  • You can provide a title and a description embedded within the file
  • A WorldInfo node provides title and description information for your world
  • title - the name for your world
  • info - any additional information
  • WorldInfo {
        title "My Masterpiece"
        info  [ "copyright (c) 1997 Me." ]
    }
  • Several VRML extensions are in progress
  • External authoring interface (EAI)
  • Binary file format
  • Multi-user framework
  • The EAI enables Java applets to access a VRML browser plug-in, and its world

  • The Java applet can
  • Access some browser features
  • Add VRML content
  • Send events
  • Receive events
  • Using standard Java features, the Java applet can
  • Create graphical user interfaces
  • Manage complex data structures
  • Access the Internet
  • Use text input
  • The increased power of Java applets enables application building rather than just content building
  • World builders
  • Visualization applications
  • Network games
  • Virtual reality chat spaces
  • The binary file format enables smaller files for faster download

  • The binary file format includes
  • Binary representation of nodes and fields
  • Support for prototypes
  • Geometry compression
  • Most authoring will be done with world builders that output binary VRML files directly

  • Hand-authored text VRML will be compiled to the binary format

  • Converters back to text VRML will become available
  • Comments will be lost by translation
  • WorldInfo nodes will be retained
  • Several proposals in progress to create a framework for multi-user worlds
  • Shared objects and spaces
  • Piloted objects (like avatars)
  • Common avatar descriptions
  • Working groups are considering extensions and refinements of VRML for different application areas, including
  • Scientific visualization
  • Databases
  • Network games
  • The VRML 2.0 specification
    http://vag.vrml.org/VRML2.0/FINAL

  • The VRML Repository
    http://www.sdsc.edu/vrml

  • SGI's VRML Cafe
    http://vrml.sgi.com/cafe
  • VRMLSite Magazine
    http://www.vrmlsite.com

  • NetscapeWorld Magazine
    http://www.netscapeworld.com
  • Books on VRML 2.0... we recommend (shameless plug)

    The VRML 2.0 Sourcebook
    by Ames, Nadeau, Moreland
    published by John Wiley & Sons