Wednesday, May 25, 2005

Solutions to Previous Issues

Coordinate Mapping:

The coordinate mapping seems to work now. Code from simplehapticscene was incorporated into the project prototype. It queries the dimensions of the haptic workspace and creates the openGl viewing frustum from these dimensions.
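The workspace-to-frustum mapping can be sketched as a per-axis linear interpolation. This is a minimal illustration, not the simplehapticscene code: the 6-double box layout is an assumption (I believe it matches the min/max layout OpenHaptics returns for workspace dimensions), and the box values in the usage note are made-up examples.

```cpp
#include <array>

// Axis-aligned box given as {xmin, ymin, zmin, xmax, ymax, zmax}
// (assumed to match the 6-double layout of the haptic workspace query).
using Box = std::array<double, 6>;

// Linearly map a point from the haptic workspace box into the
// openGl view volume, one axis at a time.
std::array<double, 3> mapToView(const std::array<double, 3>& p,
                                const Box& work, const Box& view) {
    std::array<double, 3> out{};
    for (int i = 0; i < 3; ++i) {
        double t = (p[i] - work[i]) / (work[i + 3] - work[i]); // 0..1
        out[i] = view[i] + t * (view[i + 3] - view[i]);
    }
    return out;
}
```

For example, with a 200 mm workspace cube mapped to a unit view cube, a point on the workspace boundary lands on the view boundary.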

Collisions and Forces:

Collisions between the haptic cursor and the TRI object have been tested. There are scaling issues between the actual contact points of the TRI object and the cursor.

Collisions occur and forces are rendered. The SWIFT package has a function to query the distance to collision, which also returns a normal vector in the direction of the collision. This is the direction in which to render the force.

Orientation

The prototype currently reads in 3 points from the stock material to create the final transformation, thanks to code provided by Professor Horsch. Three points are read in: the origin, a point along the positive x-axis, and a point along the positive y-axis. The program calculates the coordinate system from these points. This will have to be documented later.
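The three-point calculation can be sketched with cross products. This is my own illustration of the standard construction, not Professor Horsch's code; the names are invented for the example.

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;

Vec3 sub(const Vec3& a, const Vec3& b) { return {a[0]-b[0], a[1]-b[1], a[2]-b[2]}; }
Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0]};
}
Vec3 normalize(const Vec3& v) {
    double n = std::sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
    return {v[0]/n, v[1]/n, v[2]/n};
}

struct Frame { Vec3 x, y, z; };

// Build an orthonormal frame from the origin, a point on the +x axis,
// and a point in the +y half of the stock surface.
Frame frameFromPoints(const Vec3& origin, const Vec3& onX, const Vec3& onY) {
    Frame f;
    f.x = normalize(sub(onX, origin));
    f.z = normalize(cross(f.x, sub(onY, origin))); // normal to the stock surface
    f.y = cross(f.z, f.x);                         // re-orthogonalized y axis
    return f;
}
```

Note the y point does not have to lie exactly on the y-axis; the cross products square the frame up, which is why the fourth-point restriction below only needs corners.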

This code will be extended to read in a fourth point with an added restriction. The current code only gets the orientation; I will extend it so the dimensions can be read in as well. The fourth point will be along the x-axis. The restriction is that all points will have to be at corners.

This means that the stock material does not have to have a specific starting position. Once the stock material is secured to the working surface, the position is read in. This can be extended to allow the workpiece to be moved after the drilling process has been started. This assumes that the origin was not removed. Repositioning the workpiece can cause accuracy issues.

Friday, May 20, 2005

Week Update May 20, 2005

Unofficial News:

Last week's weekly update somehow got lost, or was never written, so this is a two-week update. In my opinion the project is progressing well. Later next month the drilling device should be attached and tested. The project now incorporates the collision detection package SWIFT. The system doesn't seem slow. However, the ultimate test will be loading the STL file and surrounding it with the voxel shell.

Voxel Headache:

The voxel shell is the main problem in the system because its shape is dynamic. The STL and TRI objects are compiled by openGl using display lists. The objects are first loaded into temporary storage, then stored in openGl display lists. A display list captures the shape at compile time, so the program does not need to keep the object after the shape is handed to openGl. Drawing the shape is then only one line of code.

Another issue is finding which cells of the voxel object need to be updated after a collision occurs. If the sphere contacts only part of a voxel, should it be removed? What other issues exist?
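Both policies from the question above can be sketched as distance tests between the tool-tip sphere and a cell. This is a sketch of one possible approach, not a decision; the function names and the cuboid-cell assumption are mine.

```cpp
#include <algorithm>
#include <cmath>

// Does a sphere (tool tip) overlap the axis-aligned cell
// [cellMin, cellMin + cellSize]^3 at all? Uses the closest point
// on the cell to the sphere center.
bool sphereTouchesCell(const double c[3], double r,
                       const double cellMin[3], double cellSize) {
    double d2 = 0.0;
    for (int i = 0; i < 3; ++i) {
        double lo = cellMin[i], hi = cellMin[i] + cellSize;
        double q = std::max(lo, std::min(c[i], hi)); // clamp center to cell
        d2 += (c[i] - q) * (c[i] - q);
    }
    return d2 <= r * r;
}

// A stricter policy: only remove the voxel when its center is inside
// the sphere, so a grazing contact does not erase material.
bool sphereCoversCellCenter(const double c[3], double r,
                            const double cellMin[3], double cellSize) {
    double d2 = 0.0;
    for (int i = 0; i < 3; ++i) {
        double m = cellMin[i] + 0.5 * cellSize;
        d2 += (c[i] - m) * (c[i] - m);
    }
    return d2 <= r * r;
}
```

The first test removes material on any contact; the second only when the tool clearly covers the cell. Which feels better will have to be decided at the device.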

SWIFT News:

The package was incorporated into stlViewer first and then into Prototype. Under Prototype the collision detection algorithm did not seem to slow down the system. This needs more testing, but it is a very good sign. The step of voxelizing the STL object will be left out of the system unless a noticeable system lag occurs.

Status:

SWIFT Collision detection is tested and working.
TRI files are used to load the objects.
The haptic device can move the cursor on screen.
HDAPI is used to update position, render forces, and perform collision detection.

Issues:

Mapping haptic coordinates to the screen.
Test collision detection with haptic cursor and TRI object.
Surround STL/TRI object with voxel shell.
Moving voxel shell, coordinate system, calibration, other related issues.
Generate correct force vector.

Thursday, May 19, 2005

Implementing SWIFT

Project stlViewer was modified to incorporate the SWIFT package. The project uses SWIFT to load two spherical objects from a given TRI file. The SWIFT source code was modified to allow the vertex data to be retrieved from it. The program has a TRI file reader and stores the data as a mesh. A function was added that returns a struct, SOLID, containing the array of all faces, with each face holding 3 vertices.
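The layout described above might look like the sketch below. These type names are illustrative (only SOLID is named in the post; the rest, and the expand helper, are my assumptions about how an indexed TRI mesh becomes a per-face array).

```cpp
#include <cstddef>
#include <vector>

// Illustrative layout for the mesh handed back by the modified reader:
// every face carries copies of the 3 vertices it references.
struct Vertex { double x, y, z; };
struct Face   { Vertex v[3]; };
struct SOLID  { std::vector<Face> faces; };

// Expand an indexed TRI-style mesh (unique vertices + 3 indices per
// face) into the per-face layout above.
SOLID expand(const std::vector<Vertex>& verts,
             const std::vector<int>& indices) {
    SOLID s;
    for (std::size_t i = 0; i + 2 < indices.size(); i += 3) {
        Face f;
        for (int k = 0; k < 3; ++k) f.v[k] = verts[indices[i + k]];
        s.faces.push_back(f);
    }
    return s;
}
```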

stlViewer loads the two sphere objects and draws them using openGl. One sphere can be moved, and when a collision occurs the console prints that a collision has occurred.

SWIFT will be linked with the haptic thread. The haptics thread will send the rotation and translation information to the graphics thread and to the SWIFT package, because the only moving object is the haptic device/drilling tool. The SWIFT package offers a function that returns the distance between the two objects. This distance will be used to compute a spring force.
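Converting that distance into a force might look like the following sketch. The SWIFT query itself is not shown (its exact signature is not reproduced here); the function just takes the distance and normal the query reports, and the stiffness k and rest distance are tuning values I invented, not SWIFT output.

```cpp
#include <array>

using Vec3 = std::array<double, 3>;

// Penalty/spring force from a collision query result:
// F = k * penetration * n, where n is the reported collision normal.
// restDistance is the separation at which force starts (assumption).
Vec3 springForce(const Vec3& normal, double distance,
                 double restDistance, double k) {
    double penetration = restDistance - distance; // > 0 means too close
    if (penetration <= 0.0) return {0.0, 0.0, 0.0};
    return {k * penetration * normal[0],
            k * penetration * normal[1],
            k * penetration * normal[2]};
}
```

The haptic loop would call this at 1000 Hz and hand the result to the device; the force goes to zero smoothly as the tool backs away past the rest distance.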

Saturday, May 14, 2005

Building SWIFT: Issues and Tips

Package Details:

The SWIFT collision detection package is available from the UNC software page. The download is a ZIP file containing the source code, documentation and an example project. Compiling the example project in .Net 2003 is straightforward after unpacking the ZIP file. The files are written in C++. The files swift.h and swift.lib must be included in a project to use the SWIFT collision detection package.

Rebuilding SWIFT:

The thesis project requires using a multi-threaded DLL runtime library because of the OpenHaptics Toolkit. The example SWIFT file is set to build using the single-threaded runtime library. To test the SWIFT library, the example project was built using the multi-threaded DLL option. To successfully build the example and the thesis project with the SWIFT library included, the SWIFT package must be rebuilt.

Rebuilding Source Code
  1. Open Project file swift.dsw
  2. Change Build Configuration from Debug to Release
  3. Change Project Properties->C/C++->Code Generation->Runtime Library
    • Multi-threaded DLL (/MD)
  4. Build Project
    • If Link errors occur see below

Including the SWIFT library into Existing Projects

  1. Check that Project Properties
    • C/C++->Code Generation->Runtime Library is set to Multi-threaded DLL (/MD)
    • C/C++->General->Additional Include Directories add path to swift.h header file. (ex. ..\swift\include\ )
    • Linker->General->Additional Library Directories add path to the rebuilt swift.lib file. (ex. ...\swift\release\ )
    • Linker->Input->Additional Dependencies add swift.lib
  2. Include swift.h

Possible Error Messages VC++ .Net 2003

If link errors occur complaining about missing files, this is caused by switching the runtime libraries. The missing files are part of Visual Studio 6. If you don't have Visual Studio 6, try searching the internet for the files or find someone with a copy of them. Copy the library and DLL files from a Visual Studio directory to the windows/system32 directory, or place them in the project folder.

Thursday, May 12, 2005

STL files and Collision Detection

Swift Custom File Formats

The UNC collision detection packages offer speed and flexibility. The file types they use contain all the sets of vertices and faces that represent the object. The faces for this project will be triangles, although this is not a requirement. The UNC collision detection package SWIFT uses a specialized file format named TRI. All vertices are listed first, then the faces are built by referencing these vertices.

The format for a TRI file is

TRI
nv = number of vertices
nf = number of faces
coordinates = list of the vertex position coordinates as reals.
There are 3*nv coordinates.
face indices = list of the vertex indices given in CCW orientation
for each face. There are 3*nf indices.

[1]

The main difference between STL files and TRI files is the way they represent the triangles. STL files specify coordinates for each vertex in each triangle (face). This is inefficient because in a closed object a vertex can be shared by multiple faces, so the vertex is repeated. The TRI file format, among others, lists all possible vertices first; then the triangle faces are created by referencing 3 vertices.
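The STL-to-TRI conversion discussed below amounts to deduplicating those repeated vertices. A minimal sketch (my own, not the converter mentioned later) using a map from coordinates to index:

```cpp
#include <map>
#include <tuple>
#include <vector>

struct P { double x, y, z; };

// Collapse the repeated per-triangle vertices of an STL-style soup
// (3 vertices per triangle, in order) into a unique vertex list plus
// face indices, i.e. the TRI representation.
void indexify(const std::vector<P>& soup,
              std::vector<P>& verts, std::vector<int>& indices) {
    std::map<std::tuple<double, double, double>, int> seen;
    for (const P& p : soup) {
        auto key = std::make_tuple(p.x, p.y, p.z);
        auto it = seen.find(key);
        if (it == seen.end()) {
            it = seen.emplace(key, static_cast<int>(verts.size())).first;
            verts.push_back(p);
        }
        indices.push_back(it->second);
    }
}
```

Exact coordinate comparison works here because STL files repeat a shared vertex with bit-identical values; a real converter might still want a small tolerance.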

Effect

There is a file format, TRIS, that is similar to an STL file. Techniques for converting STL files will be researched. Currently I have found a program that converts an STL file to an .OBJ file.

For an explanation of why the project needs collision detection, see force-rendering-and-collision.html.

[1] Ehmann, Stephen. SWIFT: Speedy Walking via Improved Feature Testing. Application Manual. http://www.cs.unc.edu/~geom/SWIFT/

Tuesday, May 10, 2005

Possible Collision Detection Solutions

Since the HLAPI cannot be modified, it cannot be used in this project. This means we don't get the benefit of the automatic collision detection and haptic rendering provided by the HLAPI. A collision detection package must be incorporated into the project to help render forces when contact with the STL object occurs. UNC Chapel Hill's team GAMMA has an impressive page on collision detection, haptics and robotics. Professor Horsch has suggested implementing one of two packages:

  • SWIFT
  • PQP

SWIFT claims speed, while PQP seems to be easier to implement. Both packages will be explored over the next few weeks.

Friday, May 06, 2005

Week Update May 6, 2005

Project Prototype:

Voxels
The voxel object was incorporated into the project. Currently it is an example provided by Prof. Horsch. The code will be modified to create the cuboid voxel shell. The voxel object will have to surround the STL object.

Forces and Collision
When a collision occurs a force will be sent to the haptic device. The focus of the next few weeks will be resetting and adjusting forces.

Project stlViewer:

The voxel object was modified. Each voxel is now drawn as a 3D coordinate marker using 3 lines rather than 6 planes.

Progress

A drilling tool was provided by Prof. Dr. Horsch. Next week I will enlist the help of the Darmstadt machining lab to attach the tool to the haptic device.

Outlook

By the June 3 demo:

  • Align voxel object and STL object; import voxel object using the haptic device.
  • Feedback to simulate a collision and resist penetration.
Next phase

  • Drilling tool should be attached.
  • Experiment with drilling simple shapes
  • Collision Detection with STL object

Thread Safety .. or Ignoring Thread Safety

Problem:
The voxel object needs to be drawn by the graphics thread and updated by the haptics thread. The Haptics Programmers PDF suggests creating a snapshot of shared data, because the haptics thread runs considerably faster and can change the data while the graphics thread is attempting to draw the object. Creating a snapshot would mean that, in the worst case, a 128^3 3D array would have to be copied 60 times per second.

Partial Solutions and Assumptions:
The graphics thread is a read-only thread while the haptics thread is a write-only thread. The collision detection runs in the haptic thread, so the data there needs to be correct. The proposed solution is to accept the possibility of inconsistent data in the graphics thread and have one object shared between the two threads. It is unknown whether there will be a noticeable display lag in the graphics thread. This issue will be tracked throughout the course of the project.
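One way to make the single-writer/single-reader sharing above safe without copying a snapshot is per-cell atomics. This is a sketch of the idea under that assumption (2005-era code would have used volatile or nothing; std::atomic is the modern equivalent), with names of my own:

```cpp
#include <atomic>
#include <vector>

// Shared voxel grid: the haptics thread writes cell states, the
// graphics thread only reads. Per-cell atomics avoid torn values
// without the cost of copying a 128^3 snapshot 60 times per second.
class VoxelGrid {
public:
    explicit VoxelGrid(int n) : n_(n), cells_(n * n * n) {}

    // Haptics thread (writer).
    void set(int x, int y, int z, unsigned char v) {
        cells_[index(x, y, z)].store(v, std::memory_order_relaxed);
    }

    // Graphics thread (reader): may see a value one update stale,
    // which only affects a single drawn frame.
    unsigned char get(int x, int y, int z) const {
        return cells_[index(x, y, z)].load(std::memory_order_relaxed);
    }

private:
    int index(int x, int y, int z) const { return (z * n_ + y) * n_ + x; }
    int n_;
    std::vector<std::atomic<unsigned char>> cells_;
};
```

The reader can observe a half-updated drilling pass, but since each cell is read or written whole, the worst case is a briefly inconsistent picture, never corrupted data.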

Wednesday, May 04, 2005

Terms and Definitions

Callback -- Function that is set to run on the scheduler. These can set responses to user events or query device state. They can be Asynchronous or Synchronous. callbacks-and-scheduler-from-hdapi
Graphics Thread -- see Scheduler
Haptic Device -- Phantom Omni 6 DOF device. Can render force feedback in 3 dimensions. sensable-omni-phantom
Haptic Thread -- see Scheduler
HDAPI -- Haptic Device API. This allows a low-level interface with the haptic device. see hdapi-vs-hlapi
HLAPI -- Haptic Library API. This builds on top of the HDAPI and allows collision detection based on openGL methods. see hdapi-vs-hlapi
Octree -- Storage technique for representing 3D objects, an extension of quadtrees. A cube is divided into 8 equal-sized cubes; there are multiple levels, each node having 8 children. This storage technique is meant to enhance performance by reducing the number of cells to check. E.g., an array would check each item in the array, while an octree quickly dismisses branches that are unnecessary.
OpenGl -- A graphics programming API allowing 2 and 3 dimensional manipulation. multi-language multi-platform
Scheduler -- This is an infinite loop that responds to events based on callbacks. Events can come from user devices or be based on idle time. The scheduler executes at a set frequency. openGl and the HDAPI use separate schedulers: openGl runs at around 60 Hz while the haptic scheduler runs at 1000 Hz. callbacks-and-scheduler-from-hdapi
STL file -- STereoLithography. A triangle mesh that forms a shell of a solid object. All faces are listed. Each face has a normal vector and then 3 vectors with 3 coordinates each. see stl-files-and-collision-detection
STL Object -- This represents the desired shape in the virtual environment, consisting of openGL triangles. The triangle data is loaded from an STL file. Forces should be rendered to prevent the tool tip from penetrating this shape.
Stock Material -- Real world drilling material. This will be simulated by the Voxel Object
SWIFT -- University of North Carolina's collision detection package. SWIFT
TCP -- Tool Center Point. This is the endpoint of the kinematic chain representing the tooltip of the Haptic device.
TRI file -- a file consisting of faces and vertex points that represent a triangle mesh forming the shell of a solid object. First all unique vertex positions are listed; then all triangles are formed by referencing 3 of the listed vertex positions. see stl-files-and-collision-detection
Voxel -- Volume Pixel, used to represent 3D objects. see voxels-what-to-know
Voxel Object -- The 3D array that surrounds the STL Object and simulates the Stock Material. Each cell has a value associated with it: 0 is Empty, 1 is Inner, and 2 is Body. Cells marked Body are the only cells drawn. The Voxel Object starts as a cuboid shell.
Voxel Shell -- All cells of the Voxel Object that have value Body. This simulates the current progress and should mirror the actual milling material.
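The Empty/Inner/Body states and the initial cuboid shell can be sketched as follows. This is an illustration of one plausible initialization (boundary cells Body, interior Inner), not the project's actual code:

```cpp
#include <vector>

enum Cell : unsigned char { Empty = 0, Inner = 1, Body = 2 };

// Initialize an n*n*n grid as a cuboid shell: boundary cells are Body
// (the only cells drawn), everything inside is Inner.
std::vector<unsigned char> makeShell(int n) {
    std::vector<unsigned char> g(n * n * n, Inner);
    for (int z = 0; z < n; ++z)
        for (int y = 0; y < n; ++y)
            for (int x = 0; x < n; ++x)
                if (x == 0 || y == 0 || z == 0 ||
                    x == n - 1 || y == n - 1 || z == n - 1)
                    g[(z * n + y) * n + x] = Body;
    return g;
}
```

As drilling removes Body cells, adjacent Inner cells would be promoted to Body so the shell keeps collapsing inward.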

Programs:

stlViewer: Initial project. It was first used to load and display STL objects from a file; then haptics were added, and voxels after that. Collision detection and force rendering were explored using this project, which relies on the HLAPI. This project was converted into prototype. Recently it was extended to use the SWIFT collision detection package.

prototype: Working project. This is the conversion of the project stlViewer to use the HDAPI. It will have custom force rendering and SWIFT collision detection.

Tuesday, May 03, 2005

Force Assumptions and Method

The haptic device can only render forces in 3 dimensions. The tooltip has no torque feedback, so for this project the milling must be perpendicular to the real-world material (voxel object). When the tool center point (TCP) enters the voxel object, force rendering starts.

First the force axis and force direction are computed from the initial contact point:

  • Force axis -- The x, y or z axis that the force will be rendered along
  • Force direction -- The direction along that axis to render the force: initial point - current point

While the TCP lies within the voxel object the force axis and force direction will remain the same. The user must exit the voxel object to reset the force axis.
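Locking the axis and direction at first contact might look like the sketch below. The post doesn't specify the selection rule, so picking the axis with the largest entry displacement is my assumption, and the names are invented.

```cpp
#include <cmath>

// Chosen once at first contact and kept until the tool leaves the
// voxel object: the axis with the largest displacement since entry,
// and the sign of (initial - current) along that axis.
void lockForceAxis(const double initial[3], const double current[3],
                   int& axis, double& sign) {
    axis = 0;
    double best = 0.0;
    for (int i = 0; i < 3; ++i) {
        double d = initial[i] - current[i];
        if (std::fabs(d) > std::fabs(best)) { best = d; axis = i; }
    }
    sign = (best >= 0.0) ? 1.0 : -1.0;
}
```

The haptic loop would call this once on entry, then render force only along the locked axis until the TCP exits the voxel object.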

Proper use of the milling tool:


  • Tool is held perpendicular to the voxel object at all times
  • Milling at an angle will cause unexpected results and is unsafe!
  • The tool should be withdrawn from the voxel shell when changing the milling axis

Force Rendering and HLAPI News

Professor Horsch and I sent an email to the support department at SensAble to check if the HLAPI (Haptic Library API) could be modified to move the tool center point. I was informed by Prof. Horsch that currently the HLAPI cannot be modified. The next release of the development kit is set for July/August and this feature may be included.

The project will use the HDAPI (Haptic Device API) and all forces will be rendered manually. There will be several different forces:


  • A lock position -- This could be used to 'hang' the tool. Force would be applied to prevent the tool from moving. This point would be away from the material
  • Constant friction -- this might assist the overall experience. **will explain if included**
  • Voxel Contact -- The force cannot be too great, but the user must feel contact with the voxel shell. The voxel shell is deformable. Force is rendered in the direction opposite the mill direction
  • STL Voxel Shell -- **If included** Provide stronger feedback, yet not highest level
  • STL object -- Highest resisting force

The initial point of contact with the voxel shell can be tracked. While still in contact with voxels, the force is rendered along the vector from the current position to the initial point of contact.

Assumption

The voxel object has a collapsing outer shell. In the end, all voxels in the voxel object should lie inside the STL Object.

STL Voxels **if included** are not drawn

New Issue


Collision Detection for STL object - Cylinder to set of static triangle planes

Sunday, May 01, 2005

Week Update April 29, 2005

Current Work:

The Prototype project was created. It is the combination of stlViewer, cubeTest, slidingContact and various parts from other haptics example projects. The Prototype project is the conversion of stlViewer to use the HDAPI.

Currently the project inputs an STL file. It has simple lighting and graphics. The viewing angle is controlled by the keyboard, but rotations aren't included. The haptic interface is set up, but forces are not yet rendered. The outline of the Prototype project follows the example project slidingContact.

New Issues/Questions:

  • Collision Detection between cursor (cylinder) and STL object (triangle planes)
  • Attaching drilling tool to haptic device
  • Fitting STL object into an imported Voxel Shell
    • Scaling
    • Orientation
  • Method for computing Forces
    • How? -- hdSet (...) --
    • What are the max forces?
    • How the forces change between contact with objects
